Neg Intelligence Internals



Intelligence Internals


1nc Frontline

Alt causes- HUMINT issues generate from deployment of spies and complete penetration of organizations is impossible

O’Brien, 05- President and CEO of Artemis Global Logistics & Solutions, Former Graduate Research Assistant at the Jebsen Center for Counter Terrorism Research, Former International Trade Specialist for the Department of Commerce (James, “Trojan Horses: Using Current U.S. Intelligence Resources To Successfully Infiltrate Islamic Terror Groups”, International Affairs Review Vol. 14 No.2 Fall 2005)//KTC

Nevertheless, it is easier to recognize HUMINT deficiencies than to fix them. This is especially true when reconstituting sectors spread over several agencies that have been allowed to corrode. There is no quick fix in resolving this deficiency. This reality is recognized by both policy advisors and policy-makers, who propose long-term investments in intelligence reform. While U.S. policymakers are emphasizing the need for rapid intelligence overhaul to close the HUMINT deficit, the United States is fighting a War on Terror with other countries’ unreliable eyes. A 2002 Congressional Research Service report exemplifies this mindset: First is a renewed emphasis on human agents. Signals intelligence and imagery satellites have their uses in the counterterrorism mission, but intelligence to counter terrorism depends more on human intelligence (HUMINT) such as spies and informers. Any renewed emphasis on human intelligence necessarily will involve a willingness to accept risks of complicated and dangerous missions, and likely ties to disreputable individuals who may be in positions to provide valuable information. Time and patience will be needed to train analysts in difficult skills and languages. Unfortunately, the “time and patience” necessary to develop these operatives is not a luxury the United States can afford. The 9/11 Commission Report describes the rapid nature and lack of warning that defines the current security environment: National security used to be considered by studying foreign frontiers, weighing opposing groups of states, and measuring industrial might…. Threats emerged slowly, often visibly, as weapons were forged, armies conscripted, and units trained and moved into place…. Now threats can emerge quickly. 
An organization like al Qaeda, headquartered in a country on the other side of the earth, in a region so poor that electricity or telephones were scarce, could nonetheless scheme to wield weapons of unprecedented destructive power in the largest cities of the United States. Furthermore, even if the United States succeeds in developing the types of intelligence operatives with the skill sets desired for an effective war against Islamic extremists, the capacity to penetrate these groups will likely never be fully achieved. The problem is that Islamic terrorist groups are highly insulated from outside intrusion because of their family-based and/or clan-based recruitment policies: “Ethnically based terrorist groups recruit new members personally known to them, people whose backgrounds are known and who often have family ties to the organization. Intelligence penetration of organizations recruited this way is extremely difficult.” Even those organizations that do not recruit exclusively through family ties, such as al Qaeda, still employ a severe level of vetting that places an operative’s survival in jeopardy. Regional dialects, local cultural sensitivities and “six-degrees-of-separation” within small populations all work against an operative attempting to secure a terrorist leader’s trust. Recognizing these difficulties, Richard Betts summarizes this operational reality: “More and better spies will help, but no one should expect breakthroughs if we get them. It is close to impossible to penetrate small, disciplined, alien organizations like Osama bin Laden’s al Qaeda, and especially hard to find reliable U.S. citizens who have even a remote chance of trying.” Nevertheless, the intelligence community should pursue HUMINT reform that will develop operatives with penetration potential, but accessing the inner circles of terror groups may take years to materialize, or may even be impossible. 
For example, if the operative is accepted by a terror group, he may be isolated or removed from the organization’s hierarchy, leaving the operative uninformed as to what other groups within the same organization are planning, including the cell within which he may be operating. Therefore, recognizing the U.S. HUMINT deficiency, the lengthy process of comprehensive reform, the unpredictable nature of terrorism as a constant imminent threat, and the insulated structure of terrorist groups, the United States will need to employ creative methods to collect information without jeopardizing long-term intelligence reform. Bruce Hoffman suggests “some new, ‘out-of-the-box’ thinking that would go beyond simple bureaucratic fixes.” One possibility is taking a backdoor approach to penetrating various fundamentalist terrorist organizations. SOLUTION PROPOSED: WORK WITH THE TOOLS WE HAVE The Backdoor One backdoor ripe for exploitation is the dependence of Islamic extremists on illicit activities and services to fund, train, and/or facilitate their operations. The “Achilles heel” of terror groups is their dependence on criminal or other interconnected terrorist groups to provide certain services to them, specifically weapons and drug smuggling. The United States should exploit this dependence and has the capacity to do so. This backdoor should be envisioned just as the name connotes: an alternative entrance that is easier to sneak into than the front door. In the world of computer programming, a backdoor is “an undocumented way of gaining access to a program, online service or an entire computer system. The backdoor is written by the programmer who creates the code for the program. It is often only known by the programmer. 
A backdoor is a potential security risk.” When hackers discover backdoors in software programs, they exploit them. The U.S. intelligence community should adopt the hackers’ approach; infiltration agents should be looking for similar types of alternative access routes.

No NSA overload – Accumulo tech solves.

Harris ‘13

(Not Scott Harris, because large data sets do sometimes overwhelm him… But Derrick Harris. Derrick is a senior writer at Gigaom and has been a technology journalist since 2003. He has been covering cloud computing, big data and other emerging IT trends for Gigaom since 2009. Derrick also holds a law degree from the University of Nevada, Las Vegas. This evidence is also internally quoting Adam Fuchs – a former NSA employee who was involved in software design. “Under the covers of the NSA’s big data effort” – Gigaom - Jun. 7, 2013 -

The NSA’s data collection practices have much of America — and certainly the tech community — on edge, but sources familiar with the agency’s technology are saying the situation isn’t as bad as it seems. Yes, the agency has a lot of data and can do some powerful analysis, but, the argument goes, there are strict limits in place around how the agency can use it and who has access. Whether that’s good enough is still an open debate, but here’s what we know about the technology that’s underpinning all that data. The technological linchpin to everything the NSA is doing from a data-analysis perspective is Accumulo, an open-source database the agency built in order to store and analyze huge amounts of data. Adam Fuchs knows Accumulo well because he helped build it during a nine-year stint with the NSA; he’s now co-founder and CTO of a company called Sqrrl that sells a commercial version of the database system. I spoke with him earlier this week, days before news broke of the NSA collecting data from Verizon and the country’s largest web companies. The NSA began building Accumulo in late 2007, Fuchs said, because they were trying to do automated analysis for tracking and discovering new terrorism suspects. “We had a set of applications that we wanted to develop and we were looking for the right infrastructure to build them on,” he said. The problem was those technologies weren’t available. He liked what projects like HBase were doing by using Hadoop to mimic Google’s famous BigTable data store, but it still wasn’t up to the NSA requirements around scalability, reliability or security. So, they began work on a project called CloudBase, which eventually was renamed Accumulo. Now, Fuchs said, “It’s operating at thousands-of-nodes scale” within the NSA’s data centers. There are multiple instances each storing tens of petabytes (1 petabyte equals 1,000 terabytes or 1 million gigabytes) of data and it’s the backend of the agency’s most widely used analytical capabilities. 
Accumulo’s ability to handle data in a variety of formats (a characteristic called “schemaless” in database jargon) means the NSA can store data from numerous sources all within the database and add new analytic capabilities in days or even hours. “It’s quite critical,” he added. What the NSA can and can’t do with all this data As I explained on Thursday, Accumulo is especially adept at analyzing trillions of data points in order to build massive graphs that can detect the connections between them and the strength of the connections. Fuchs didn’t talk about the size of the NSA’s graph, but he did say the database is designed to handle months or years worth of information and let analysts move from query to query very fast. When you’re talking about analyzing call records, it’s easy to see where this type of analysis would be valuable in determining how far a suspected terrorist’s network might spread and who might be involved.
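The call-graph analysis described above can be sketched with an ordinary key/value layout: each call record becomes an edge keyed by one endpoint, with the other endpoint as the column and a running count as the edge strength. This is a toy illustration of the pattern, not Accumulo's actual API or the NSA's real schema; all names here are invented.

```python
# Toy key/value edge store: row -> {column: weight}. Illustrative only.
from collections import defaultdict

class EdgeStore:
    """Stores each call as a weighted, undirected edge."""
    def __init__(self):
        self.rows = defaultdict(lambda: defaultdict(int))

    def ingest_call(self, caller, callee):
        # Writing both directions lets either endpoint be scanned by row key.
        self.rows[caller][callee] += 1
        self.rows[callee][caller] += 1

    def neighbors(self, node):
        # A single row scan answers "who is connected, and how strongly?"
        return dict(self.rows[node])

store = EdgeStore()
for rec in [("alice", "bob"), ("alice", "bob"), ("bob", "carol")]:
    store.ingest_call(*rec)

print(store.neighbors("bob"))  # {'alice': 2, 'carol': 1}
```

Because the layout is schemaless in the same spirit the card describes, new data sources (email headers, chat logs) could be ingested as more edge rows without changing the store.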

Accumulo solves without violating privacy.

Henschen ‘13

Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data and analytics. He previously served as editor in chief of Intelligent Enterprise, editor in chief of Transform Magazine, and Executive Editor at DM News. He has covered IT and data-driven marketing for more than 15 years. “Defending NSA Prism's Big Data Tools”- Information Week - Commentary - 6/11/2013 -

The more you know about NSA's Accumulo system and graph analysis, the less likely you are to suspect Prism is a privacy-invading fishing expedition. It's understandable that democracy-loving citizens everywhere are outraged by the idea that the U.S. Government has back-door access to digital details surrounding email messages, phone conversations, video chats, social networks and more on the servers of mainstream service providers including Microsoft, Google, Yahoo, Facebook, YouTube, Skype and Apple. But the more you know about the technologies being used by the National Security Agency (NSA), the agency behind the controversial Prism program revealed last week by whistleblower Edward Snowden, the less likely you are to view the project as a ham-fisted effort that's "trading a cherished American value for an unproven theory," as one opinion piece contrasted personal privacy with big data analysis. The centerpiece of the NSA's data-processing capability is Accumulo, a highly distributed, massively parallel processing key/value store capable of analyzing structured and unstructured data. Accumulo is based on Google's BigTable data model, but NSA came up with a cell-level security feature that makes it possible to set access controls on individual bits of data. Without that capability, valuable information might remain out of reach to intelligence analysts who would otherwise have to wait for sanitized data sets scrubbed of personally identifiable information. As InformationWeek reported last September, the NSA has shared Accumulo with the Apache Foundation, and the technology has since been commercialized by Sqrrl, a startup launched by six former NSA employees joined by former White House cybersecurity strategy director (and now Sqrrl CEO) Ely Kahn. 
"The reason NSA built Accumulo and didn't go with another open source project, like HBase or Cassandra, is that they needed a platform where they could tag every single piece of data with a security label that dictates how people can access that data and who can access that data," said Kahn in an interview with InformationWeek. Having left government employment in 2010, Kahn says he has no knowledge of the Prism program and what information the NSA might be collecting, but he notes that Accumulo makes it possible to interrogate certain details while blocking access to personally identifiable information. This capability is likely among the things James R. Clapper, the U.S. director of National Intelligence, was referring to in a statement on the Prism disclosure that mentioned "numerous safeguards that protect privacy and civil liberties." Are They Catching Bad Guys? So the NSA can investigate data with limits, but what good is partial information? One of Accumulo's strengths is finding connections among seemingly unrelated information. "By bringing data sets together, [Accumulo] allowed us to see things in the data that we didn't necessarily see from looking at the data from one point or another," Dave Hurry, head of NSA's computer science research section, told InformationWeek last fall. Accumulo gives NSA the ability "to take data and to stretch it in new ways so that you can find out how to associate it with another piece of data and find those threats."
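The cell-level labeling Kahn describes can be sketched in a few lines: every value carries a security label, and a scan returns only the cells whose label is satisfied by the reader's authorizations. This is a minimal conceptual sketch, not Accumulo's real visibility system (whose labels are boolean AND/OR expressions, not flat sets); the record fields and label names are invented for illustration.

```python
# Minimal sketch of cell-level visibility filtering. Illustrative only.
def scan(cells, authorizations):
    """Return only the (key, value) pairs this reader's labels allow."""
    return {k: v for k, (v, labels) in cells.items()
            if labels <= authorizations}  # every required label must be held

record = {
    "phone:number": ("555-0100", {"analyst"}),
    "phone:owner":  ("J. Doe",   {"analyst", "pii"}),  # identifying detail
    "graph:degree": ("2",        {"analyst"}),
}

# A reader without the "pii" authorization can interrogate the link data
# while the personally identifiable cell stays out of reach.
print(scan(record, {"analyst"}))
# {'phone:number': '555-0100', 'graph:degree': '2'}
```

The design point the card is making falls out directly: access control lives on the data itself, so the same table serves readers with different clearances without pre-building sanitized copies.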

Aff exaggerates – NSA budget’s too small for untargeted mass data collection

Harris ‘13

(Not Scott Harris, because large data sets do sometimes overwhelm him… But Derrick Harris. Derrick is a senior writer at Gigaom and has been a technology journalist since 2003. He has been covering cloud computing, big data and other emerging IT trends for Gigaom since 2009. Derrick also holds a law degree from the University of Nevada, Las Vegas. This evidence is also internally quoting Adam Fuchs – a former NSA employee who was involved in software design. “Under the covers of the NSA’s big data effort” – Gigaom - Jun. 7, 2013 -

We’re not quite sure how much data the two programs that came to light this week are actually collecting, but the evidence suggests it’s not that much, at least from a volume perspective. Take the PRISM program that’s gathering data from web properties including Google, Facebook, Microsoft, Apple, Yahoo and AOL. It seems the NSA would have to be selective in what it grabs. Assuming it includes every cost associated with running the program, the $20 million per year allocated to PRISM, according to the slides published by the Washington Post, wouldn’t be nearly enough to store all the raw data, much less new datasets created from analyses, from such large web properties. Yahoo alone, I’m told, was spending over $100 million a year to operate its approximately 42,000-node Hadoop environment, consisting of hundreds of petabytes, a few years ago. Facebook users are generating more than 500 terabytes of new data every day. Using about the least-expensive option around for mass storage — cloud storage provider Backblaze’s open source storage pod designs — just storing 500 terabytes of Facebook data a day would cost more than $10 million in hardware alone over the course of a year. Using higher-performance hard drives or other premium gear — things Backblaze eschews because it’s concerned primarily about cost and scalability rather than performance — would cost even more. Even at the Backblaze price point, though, which is pocket change for the NSA, the agency would easily run over $20 million trying to store too many emails, chats, Skype calls, photos, videos and other types of data from the other companies it’s working with.
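The storage arithmetic above is easy to check back-of-the-envelope. The per-terabyte price used here is an assumption roughly in the ballpark of Backblaze-style commodity storage pods of that era, not a figure quoted in the card:

```python
# Back-of-the-envelope check on the card's "$10 million per year" claim.
TB_PER_DAY = 500     # Facebook's stated daily new data, from the card
DAYS = 365
COST_PER_TB = 60     # assumed commodity-hardware cost in USD (illustrative)

annual_tb = TB_PER_DAY * DAYS
annual_cost = annual_tb * COST_PER_TB
print(f"{annual_tb:,} TB/year -> ${annual_cost:,.0f}")
# 182,500 TB/year -> $10,950,000
```

At anywhere near that price point, Facebook's stream alone exceeds $10 million a year in hardware, which is the card's point: a $20 million PRISM budget cannot cover bulk ingestion from several such companies at once.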

Big Data can handle it.

Pontius ‘14

Brandon H. Pontius. The author holds a B.S. and an M.B.A. from Louisiana State University. The author wrote this piece in partial fulfillment of a Master of Science in Computer Science from the Naval Postgraduate School. The thesis advisor who reviewed this piece is Mark Gondree, PhD. Gondree is a security researcher associated with the Computer Science Department at the Naval Postgraduate School – “INFORMATION SECURITY CONSIDERATIONS FOR APPLICATIONS USING APACHE ACCUMULO” - September 2014 -

Generation of actionable intelligence from large data sets requires efficient analysis. Manual analysis of large data sets to develop these insights is unsustainably resource intensive. In January 2014, the deputy director of the Defense Intelligence Agency noted, “We’re looking for needles within haystacks while trying to define what the needle is, in an era of declining resources and increasing threats” [7]. Big data platforms have the storage and analytical capabilities necessary to handle large data sets. These solutions can relieve the processing burden on human analysts and allow them to spend more time generating real intelligence [5]. Big data analytics make information more usable, improve decision making, and lead to more focused missions and services. For instance, geographically separated teams can access a real-time common operating picture, diagnostic data mining can support proactive maintenance programs that prevent battlefield failures, and data can be transformed into a common structure that allows custom queries by a distributed force composed of many communities [4], [6].

1nc Frontline Resource Wars

( ) No resource wars – too expensive and market checks

Victor ‘8

David G.,- Adjunct Senior Fellow for Science and Technology, Council on Foreign Relations; Director, Program on Energy and Sustainable Development @ Stanford – “Smoke and Mirrors”

MY ARGUMENT is that classic resource wars—hot conflicts driven by a struggle to grab resources—are increasingly rare. Even where resources play a role, they are rarely the root cause of bloodshed. Rather, the root cause usually lies in various failures of governance. That argument—in both its classic form and in its more nuanced incarnation—is hardly a straw man, as Thomas Homer-Dixon asserts. Setting aside hyperbole, the punditry increasingly points to resources as a cause of war. And so do social scientists and policy analysts, even with their more nuanced views. I’ve triggered this debate because conventional wisdom puts too much emphasis on resources as a cause of conflict. Getting the story right has big implications for social scientists trying to unravel cause-and-effect and often even larger implications for public policy. Michael Klare is right to underscore Saddam Hussein’s invasion of Kuwait, the only classic resource conflict in recent memory. That episode highlights two of the reasons why classic resource wars are becoming rare—they’re expensive and rarely work. (And even in Kuwait’s case, many other forces also spurred the invasion. Notably, Iraq felt insecure with its only access to the sea a narrow strip of land sandwiched between Kuwait on one side and its archenemy Iran on the other.) In the end, Saddam lost resources on the order of $100 billion (plus his country and then his head) in his quest for Kuwait’s 1.5 million barrels per day of combined oil and gas output. By contrast, Exxon paid $80 billion to get Mobil’s 1.7 million barrels per day of oil and gas production—a merger that has held and flourished. As the bulging sovereign wealth funds are discovering, it is easier to get resources through the stock exchange than the gun barrel.

( ) Chinese Resource War claims are false

Victor ‘7

(David G.,- Adjunct Senior Fellow for Science and Technology, Council on Foreign Relations; Director, Program on Energy and Sustainable Development @ Stanford – “What Resource Wars?” 11/12)

RISING ENERGY prices and mounting concerns about environmental depletion have animated fears that the world may be headed for a spate of “resource wars”—hot conflicts triggered by a struggle to grab valuable resources. Such fears come in many stripes, but the threat industry has sounded the alarm bells especially loudly in three areas. First is the rise of China, which is poorly endowed with many of the resources it needs—such as oil, gas, timber and most minerals—and has already “gone out” to the world with the goal of securing what it wants. Violent conflicts may follow as the country shunts others aside. A second potential path down the road to resource wars starts with all the money now flowing into poorly governed but resource-rich countries. Money can fund civil wars and other hostilities, even leaking into the hands of terrorists. And third is global climate change, which could multiply stresses on natural resources and trigger water wars, catalyze the spread of disease or bring about mass migrations. Most of this is bunk, and nearly all of it has focused on the wrong lessons for policy. Classic resource wars are good material for Hollywood screenwriters. They rarely occur in the real world. To be sure, resource money can magnify and prolong some conflicts, but the root causes of those hostilities usually lie elsewhere. Fixing them requires focusing on the underlying institutions that govern how resources are used and largely determine whether stress explodes into violence. When conflicts do arise, the weak link isn’t a dearth in resources but a dearth in governance.

Ext. No Resource Wars

( ) Resource wars are empirically false and won’t escalate

Homer-Dixon ‘8

(Thomas,- Chair of Peace and Conflict Studies at the Trudeau Centre for Peace and Conflict Studies at the University of Toronto. "Oil, Oil, Toil and Trouble." – The National Interest – January/February edition)

Rather, we argue that resource stress always interacts in complex conjunction with a host of other factors--ecological, institutional, economic and political--to cause mass violence. Also, causation is almost always indirect. People, groups and countries rarely fight over natural resources directly; instead, resource stress causes various forms of social dislocation--including widening gaps between rich and poor, increased rent-seeking by elites, weakening of states and deeper ethnic cleavages--that, in turn, make violence more likely. And, finally, this violence is almost always sub-national; it takes the form of insurgency, rebellion, gangsterism and urban criminality, not overt interstate war. The claim that resource stress is sufficient by itself to cause violence is easily refuted. One simply has to identify cases where resource stress was present but violence didn't occur. Likewise, the claim that resource stress is a necessary cause of violence is easily refuted by finding cases of violence not preceded by resource stress. At various points in his article, Victor uses exactly these strategies to debunk the link between resources and war.

( ) Best studies prove resources have very small effect on warfare.

Goldstone ‘02

(Jack,- professor of public policy, George Mason, “Population and Security: How Demographic Change Can Lead to Violent Conflict,” JOURNAL OF INTERNATIONAL AFFAIRS, Fall 2002, Vol. 56, p. 123)

For example, Wenche Hauge and Tanja Ellingsen, in the most comprehensive global test of the environmental-scarcity-leads-to-violence hypothesis with recent data (1980–92), found that while deforestation, land degradation and low freshwater availability were positively correlated with the incidence of civil war and armed conflict, the magnitude of their effects was tiny. By themselves, these factors raised the probability of civil war by 0.5 to under 1.5 percent. These factors did have a slightly higher impact on the probability of lesser kinds of armed conflict (causing increases in the chances of such conflict by from 4 percent to 8 percent); but their influence paled compared to the impact of such traditional risk factors as poverty, regime type and current and prior political instability.

Ext. HUMINT fails

HUMINT fails- intelligence officers can’t adapt

Sano 1/28 (John, 2015, Former Deputy Director, National Clandestine Service, CIA, “Guide to the Study of Intelligence: The Changing Shape of HUMINT,” Draft of paper,

Managing this younger, more technically astute, workforce can be problematic for a number of reasons, not the least of which is the dramatic generational difference when it comes to learning. Today’s workforce thinks and processes information significantly differently from its predecessors. As Dr. Bruce Perry of Baylor College of Medicine has stated, “Different kinds of experiences lead to different brain structures.”2 As such, today’s workforce receives information much faster than their predecessors. And while reception does not always equal comprehension, it does present an issue for managers as well as for IC instructors. Education within the world of HUMINT is in large measure “anecdotally based,” with instruction incorporating legacy-based scenarios, or “tribal memories,” to emphasize key points. While useful, it is often a technique that many younger practitioners of espionage find unfamiliar, even ineffective. Growing up on a regular diet of technology-driven information, today’s clandestine officer is better connected and more adept at multitasking and networking than previous generations. Adjusting to this significant divide is often difficult, for most instructors view education in much the same way as they themselves were taught – via lectures, step-by-step logic and “tell-test” instruction. Today’s officers are more comfortable with procedures that they grew up with – TV, Internet, video cams, cell phones and all the other accoutrements associated with the digital age. What does this mean? Aside from the way today’s officers want to learn, it also impacts expectations. Today’s clandestine service officer expects to access any information, anytime, anywhere, and on any device. Aside from the obvious security aspects, there is also the problem of managing these expectations – attempting to inculcate the proper balance of security vs. expediency, not to mention patience within an increasingly impatient workforce, is no easy task, but it is nonetheless a critical aspect of any clandestine activity.

HUMINT is a bad form of intelligence gathering – 8 reasons

Turner ‘05 - Dr. Michael A. Turner teaches at both San Diego State University and the University of San Diego. He is also a consultant to the United States Government on national security matters. Until 2006, Dr. Turner was the Director of International Relations Program at Alliant International University in San Diego, CA. Before joining Alliant, Dr. Turner was a senior CIA officer, attached both to the analytical directorate as well as to elements supporting the Director of Central Intelligence. At varying times, Dr. Turner has taught strategic affairs at the University of Maryland, the University of Virginia, Johns Hopkins University, the University of Southern California, and the Air War College. Dr. Turner’s research interests include intelligence and national security, American foreign policy, Middle East as well as Central and South Asian Politics, and counterterrorism policy. Dr. Turner teaches both graduate and undergraduate international relations courses. (Michael A., “Why Secret Intelligence Fails”, pg 92, accessed 6-30-15)//KTC

On the other hand, HUMINT’s disadvantages probably outweigh its advantages. One, American case officers may not have sufficient training and know-how to perform their jobs well. According to the one analyst, CIA operatives are not particularly well prepared; they seldom speak foreign languages well and almost never know a line of business or a technical field. 13 Two, the process of recruiting spies is time consuming and lengthy, which often brings into question the benefits of such an activity in relation to its cost. Three, HUMINT information is highly perishable and therefore has a low threshold of utility. Four, HUMINT is often vulnerable to deception and double-agent operations. Five, spying is illegal everywhere, and case officers who have been caught in the process of recruitment have embarrassed the U.S. government and damaged relations with both unfriendly and friendly governments. Six, espionage is risky to the lives of the intelligence agents and their assets. Seven, because HUMINT assets are often employed in covert actions, espionage operations sometimes become enmeshed in political controversies at home. Eight, many people believe that spying is ethically wrong, an activity that diminishes the moral standing of the United States around the globe.

Ext. Accumulo checks overload

( ) Data overload wrong – Accumulo tech solves

Gallagher ‘13

Sean Gallagher is the IT editor at Ars Technica. Sean is a University of Wisconsin grad, a former systems integrator, and a former director of IT strategy at Ziff Davis Enterprise. He wrote his first program in high school – “What the NSA can do with ‘big data’” - Ars Technica - Jun 11, 2013 -

Ironically, about the same time these two programs were being exposed, Internet companies such as Google and Yahoo were solving the big data storage and analysis problem. In November of 2006, Google published a paper on BigTable, a database with petabytes of capacity capable of indexing the Web and supporting Google Earth and other applications. And the work at Yahoo to catch up with Google's GFS file system—the basis for BigTable—resulted in Hadoop. BigTable and Hadoop-based databases offered a way to handle huge amounts of data being captured by the NSA's operations, but they lacked something critical to intelligence operations: compartmentalized security (or any security at all, for that matter). So in 2008, NSA set out to create a better version of BigTable, called Accumulo—now an Apache Foundation project. Accumulo is a "NoSQL" database, based on key-value pairs. It's a design similar to Google's BigTable or Amazon's DynamoDB, but Accumulo has special security features designed for the NSA, like multiple levels of security access. The program is built on the open-source Hadoop platform and other Apache products. One of those is called Column Visibility—a capability that allows individual items within a row of data to have different classifications. That allows users and applications with different levels of authorization to access data but see more or less information based on what each column's "visibility" is. Users with lower levels of clearance wouldn't be aware that the column of data they're prohibited from viewing existed. Accumulo also can generate near real-time reports from specific patterns in data. So, for instance, the system could look for specific words or addressees in e-mail messages that come from a range of IP addresses; or, it could look for phone numbers that are two degrees of separation from a target's phone number. 
Then it can spit those chosen e-mails or phone numbers into another database, where NSA workers could peruse it at their leisure. In other words, Accumulo allows the NSA to do what Google does with your e-mails and Web searches—only with everything that flows across the Internet, or with every phone call you make. It works because of a type of server process called "iterators." These pieces of code constantly process the information sent to them and send back reports on emerging patterns in the data. Querying a multi-petabyte database and waiting for a response would be deadly slow, especially because there is always new data being added. The iterators are like NSA's tireless data elves.
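The "two degrees of separation" query mentioned above is ordinary breadth-first search over a call graph: collect everything reachable within two hops of a target number. The sketch below runs client-side in plain Python; in Accumulo the analogous filtering would be pushed to server-side iterators next to the data. The graph and numbers are invented for illustration.

```python
# Everything reachable within two hops of a target node. Illustrative only.
def within_two_degrees(graph, target):
    seen = {target}
    frontier = {target}
    for _ in range(2):  # expand exactly two hops
        frontier = {n for node in frontier
                      for n in graph.get(node, ())} - seen
        seen |= frontier
    return seen - {target}

calls = {
    "target": {"a", "b"},
    "a": {"target", "c"},
    "b": {"target"},
    "c": {"a", "d"},  # "d" is three hops out, so it is excluded
}
print(sorted(within_two_degrees(calls, "target")))  # ['a', 'b', 'c']
```

This also shows why the "iterator" design matters at scale: re-querying a multi-petabyte table per hop would be slow, so pushing the per-hop filtering into server-side code that streams over the data keeps the query interactive.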

( ) No NSA data overload - Accumulo checks.

Kelly ‘12

Jeff Kelly is a Principal Research Contributor at The Wikibon Project and a Contributing Editor at SiliconANGLE. He focuses on trends in Big Data and business analytics. His research has been quoted and referenced by the Financial Times, Forbes, Network World, GigaOM, TechTarget and more – “Accumulo: Why The World Needs Another NoSQL Database” – Wikibon Blog – August 20th -

If you’ve been unable to keep up with all the competing NoSQL databases that have hit the market over the last several years, you’re not alone. To name just a few, there’s HBase, Cassandra, MongoDB, Riak, CouchDB, Redis, and Neo4J. To that list you can add Accumulo, an open source database originally developed at the National Security Agency. You may be wondering why the world needs yet another database to handle large volumes of multi-structured data. The answer is, of course, that none of these NoSQL databases has yet checked all the feature/functionality boxes that most enterprises require before deploying a new technology. In the Big Data world, that means the ability to handle the three V’s (volume, variety and velocity) of data, the ability to process multiple types of workloads (analytical vs. transactional), and the ability to maintain ACID (atomicity, consistency, isolation and durability) compliance at scale. With each new NoSQL entrant, hope springs eternal that this one will prove the NoSQL messiah. So what makes Accumulo different than all the rest? According to proponents, Accumulo is capable of maintaining consistency even as it scales to thousands of nodes and petabytes of data; it can both read and write data in near real-time; and, most importantly, it was built from the ground up with cell-level security functionality.

Ext. Accumulo solves privacy

Accumulo tech solves overload – and does so without mass privacy violations.

Jackson ‘13

Joab Jackson covers enterprise software and general technology breaking news for the IDG News Service, and is based in New York. “NSA's Accumulo data store has strict limits on who can see the data” - PC World - Oct 31, 2013 -

With its much-discussed enthusiasm for collecting large amounts of data, the NSA naturally found much interest in the idea of highly scalable NoSQL databases. But the U.S. intelligence agency needed some security of its own, so it developed a NoSQL data store called Accumulo, with built-in policy enforcement mechanisms that strictly limit who can see its data. At the O’Reilly Strata-Hadoop World conference this week in New York, one of the former National Security Agency developers behind the software, Adam Fuchs, explained how Accumulo works and how it could be used in fields other than intelligence gathering. The agency contributed the software’s source code to the Apache Software Foundation in 2011. “Every single application that we built at the NSA has some concept of multi-level security,” said Fuchs, who is now the chief technology officer of Sqrrl, which offers a commercial edition of the software. The NSA started building Accumulo in 2008. Much like Facebook did with its Cassandra database around the same time, the NSA used the Google Big Table architecture as a starting point. In the parlance of NoSQL databases, Accumulo is a simple key/value data store, built on a shared-nothing architecture that allows for easy expansion to thousands of nodes able to hold petabytes worth of data. It features a flexible schema that allows new columns to be quickly added, and comes with some advanced data analysis features as well. Accumulo’s killer feature, however, is its “data-centric security,” Fuchs said. When data is entered into Accumulo, it must be accompanied with tags specifying who is allowed to see that material. Each row of data has a cell specifying the roles within an organization that can access the data, which can map back to specific organizational security policies. It adheres to the RBAC (role-based access control) model.
This approach allowed the NSA to categorize data into its multiple levels of classification—confidential, secret, top secret—as well as who in an organization could access the data, based on their official role within the organization. The database is accompanied by a policy engine that decides who can see what data. This model could be used anywhere that security is an issue. For instance, if used in a health care organization, Accumulo can specify that only a patient and the patient’s doctor can see the patient’s data. The patient’s specific doctor may change over time, but the role of the doctor, rather than the individual doctor, is specified in the database. The NSA found that the data-centric approach “greatly simplifies application development,” Fuchs said. Because data today tends to be transformed and reused for different analysis applications, it makes sense for the database itself to keep track of who is allowed to see the data, rather than repeatedly implementing these rules in each application that uses this data.
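The role-based, data-centric model in the health care example above can be sketched as follows. This is a hypothetical simplification, not the real Accumulo or Sqrrl API: each cell carries the role allowed to read it, and a small policy engine maps individual users to roles before filtering. Note that the cell references the role ("doctor"), not a specific doctor, so reassigning a patient's doctor only changes the user-to-role mapping, not the data.

```python
# Illustrative policy engine: which roles each user currently holds
USER_ROLES = {
    "dr_lee":    {"doctor"},
    "pat_jones": {"patient:jones"},
}

# Each record: (key, role allowed to read it, value) -- fabricated sample data
RECORDS = [
    ("jones|diagnosis", "doctor",        "hypertension"),
    ("jones|notes",     "doctor",        "follow up in 6 weeks"),
    ("jones|summary",   "patient:jones", "see your doctor"),
]

def read(user, records):
    """Return only the cells whose role tag matches a role the user holds.
    Unknown users hold no roles and therefore see nothing."""
    roles = USER_ROLES.get(user, set())
    return [(key, value) for key, role, value in records if role in roles]

print(read("dr_lee", RECORDS))     # the two doctor-tagged cells
print(read("pat_jones", RECORDS))  # only the patient-facing summary
print(read("stranger", RECORDS))   # []
```

Because the store itself enforces who may see each cell, every application built on top of it inherits the policy for free, which is the simplification of application development Fuchs describes.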
