This is the talk page for discussing improvements to the Utah Data Center article. This is not a forum for general discussion of the article's subject.
Article policies
Find sources: Google (books · news · scholar · free images · WP refs) · FENS · JSTOR · TWL
This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
This is text from the Camp Williams article that was deleted (I have restored it), but it has several older refs re the 'Utah Data Centre', which was apparently previously called the Community Comprehensive National Cybersecurity Initiative Data Center.
220 of Borg 10:24, 9 March 2014 (UTC) nb. Restored text from April 2012
{{cite news}}: Check date values in: |date= (help)
- 220 of Borg 04:52, 21 April 2012 (UTC)
About this added paragraph:-- Tomwsulcer ( talk) 18:35, 27 August 2012 (UTC)
My concerns are regarding WP:OR and WP:UNDUE. Can the new paragraph addition be shortened to only those aspects that explicitly refer to the Utah Data Center?-- Tomwsulcer ( talk) 21:00, 25 August 2012 (UTC)
http://www.dbia-mar.org/downloads/Army%20DBIA%20Mar16%202010.pdf references Army Corps of Engineers project 21078 in a report from March 16, 2010, listing the project as $1,489,000,000 and calling it "Bumblehive UDC (Incr 1-5)" - UDC probably stands for Utility Data Center. 67.188.202.139 ( talk) 20:34, 2 December 2012 (UTC)
Multiple yottabytes? Absurd. That exceeds worldwide platter production by an order of magnitude (annual production runs around 500M). At 300 Mb/sec, it would take 10 million years to write that much data. (Yes, I recognize this would be done in parallel; I'm trying to put a scope on the problem.) On tape, the only feasible solution (due to power/heat issues), it would take more in the 100M-year range. Multiple drives would be failing every second. Come on, this is supposed to be an encyclopedia.
All of these stories come from the same CNET source, which misread a statement that we need to eventually handle a yottabyte of information. Handle, not store. I quote the relevant passage: "as a 2007 Department of Defense report puts it, the Pentagon is attempting to expand its worldwide communications network, known as the Global Information Grid, to handle yottabytes (10^24 bytes) of data." That's a worldwide network, not the Utah facility, and the word used is 'handle'.
I am editing the article to remove the yottabyte claim.
— Preceding unsigned comment added by 71.92.252.196 ( talk) 17:11, 14 June 2013 (UTC)
I agree, the yottabyte claim is completely absurd. I did some calculations on this. A yottabyte data center would cost $10 trillion at 1 cent per GB, almost the whole GDP of the United States. And judging from the density of magnetic storage media, it would also have to be 15 miles across, which is clearly contradicted by the photos of it. Here are my calculations. Pulu ( talk) 21:58, 21 July 2013 (UTC)
Surely the article is in error. 1 yottabyte = 1 trillion terabytes. At a price of $50 a terabyte (a very conservative estimate), this would put the cost of the data center at $50 trillion, which far exceeds the federal budget of $3.6 trillion. — Preceding unsigned comment added by 71.201.194.191 ( talk) 22:03, 17 April 2013 (UTC)
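For anyone who wants to check these cost figures, here is a quick back-of-envelope sketch. The per-GB and per-TB prices are the assumptions from the comments above, not market quotes:

```python
# Back-of-envelope cost check for storing one yottabyte on disk.
YB = 1e24          # one yottabyte in bytes (10^24)
GB, TB = 1e9, 1e12

cost_at_1_cent_per_gb = (YB / GB) * 0.01   # dollars
cost_at_50_per_tb = (YB / TB) * 50.0       # dollars

print(f"${cost_at_1_cent_per_gb:.1e}")  # ~$1e13, i.e. $10 trillion
print(f"${cost_at_50_per_tb:.1e}")      # ~$5e13, i.e. $50 trillion
```

Both figures dwarf the $3.6 trillion federal budget cited above, which supports removing the yottabyte claim.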
Regarding cost, remember that the buyer is not your ordinary consumer! — Charles Edwin Shipp ( talk) 14:12, 8 June 2013 (UTC)
I think the news reports regarding the Utah facility are a fabrication of real data presented by the US government to another media source, designed to invite the public into entertainment regarding the powerhouse of funding. The article is not NPOV and should be edited by an expert from the Department of Justice. Fatum81 ( talk) 02:58, 9 June 2013 (UTC)
"A yottabyte is so big as to be nearly unimaginable by casual computer users: It’s enough information to fill 200 trillion DVDs." . . .
"The companies participating in PRISM produce enormous amounts of data every day, so storing it would require computing power the likes of which the public has never seen. People who study technology and security believe that’s why the NSA [built] a million-square-foot data center near Salt Lake City."
http://www.theblaze.com/stories/2013/06/07/11-questions-you-probably-have-about-u-s-domestic-spying-answered/
Charles Edwin Shipp ( talk) 14:09, 8 June 2013 (UTC)
Here are the prefixes you can use, from Binary_prefix:
Yottabyte is the last one listed, so is the expected data even larger?
I’ll leave it to others to say how soon the NSA Utah Data Center will fill up.
Another measure is the Googol (100 zeros) — Charles Edwin Shipp ( talk) 16:01, 8 June 2013 (UTC)
To avoid saying 'yottabytes' one computer scientist at the University of Utah says instead, "thousands of zettabytes". [2] Funny? Charles Edwin Shipp ( talk)
Just in case you want to discuss if this dimension of data makes any sense: skeptics.stackexchange -- MartinThoma ( talk) 05:14, 3 July 2013 (UTC)
Thanks, that was interesting reading (skimming), and one 'take-away quote' is: "An NSA spokeswoman says the actual data capacity of the center is classified." A few more thoughts I had while reading/skimming: (1) To think about the size/capacity, ask Edward Snowden, who has inside knowledge; indications are that such sites (multiple sites) are collecting every email etc. and storing the text, not just the metadata. (2) So how much would that take? (3) It is a military strategy not to reveal your power/capability. — Charles Edwin Shipp ( talk) 13:35, 5 July 2013 (UTC) PS: The storage estimates for Facebook and particle accelerators were very interesting also.
Sources such as the NSA's mission statement, or remarks by generals, are essentially primary sources which can be used on some occasions but only with caution, and in this case, I think we need more references. A recent change says essentially that the NSA's own declaration of its own mission is factually correct. Here is the current wording:-- Tomwsulcer ( talk) 13:55, 18 April 2013 (UTC)
My problem is we really need objective, reliable, secondary sources to back this up, and that we should not take the NSA at face-value here. Or, saying the NSA's declaration is, indeed, the case, is us committing original research, which is why I think we need to state that the NSA conceives its mission as such and such, but leave open alternate possibilities.-- Tomwsulcer ( talk) 13:55, 18 April 2013 (UTC)
If useful, please use the following for references...
The state flag of Utah highlights a beehive; the motto signifies industriousness. The website (apparently from the US government) http://nsa.gov1.info/utah-data-center/ identifies the codename and mentions 'Bumblehive' 24 times. FYI, Charles Edwin Shipp ( talk) 01:37, 9 June 2013 (UTC)
"Much of this content was derived from news media, privacy groups, and government websites. Links to these sites are posted on the left-sidebars of each page." So I would conclude that 'Bumblehive' was never a military codename. Charles Edwin Shipp ( talk) 14:09, 9 June 2013 (UTC)
I removed a bunch of original research (mostly stuff about FISA and the law, which isn't really about this article. Basically the entire "possible purposes" section was OR.) However, one sentence I removed seems like it should still be included. But the source it was cited to was incorrect, so I was wondering if anyone knows the real source so we can add the sentence back in. The relevant text is: NSA whistleblower William Binney alleged that the Bluffdale facility was designed to store a broad range of domestic communications for data mining without warrants. Capscap ( talk) 08:16, 9 June 2013 (UTC)
Here is a reference implying use: "NSA Whistleblower Speaks Out on Verizon, PRISM, and the Utah Data Center" [3] — Charles Edwin Shipp ( talk) 14:12, 9 June 2013 (UTC)
There might be some new information here:
Capscap ( talk) 08:24, 9 June 2013 (UTC)
In the following article, a reader makes this correction: "For the record, they are Intel “Xeon” processors. Not “Xenon Core” processors."
Here's an interesting item showing that Apple Co. needs support for its iCloud and iPhone/iPad.
http://blogs.wsj.com/digits/2013/07/01/apple-invests-in-solar-farm-for-nevada-data-center/?mod=trending_now_5
Charles Edwin Shipp ( talk) 19:09, 2 July 2013 (UTC)
Our reference (to the Thomas Burr piece) at the bottom of our WP article notes that, "The Utah Data Center will be part of NSA’s interconnected network that includes sites in Colorado, Georgia and Maryland, and since the Utah facility will be the largest, there is a good chance Americans’ phone call data could land in the Bluffdale site at least temporarily. "I wouldn’t say I know it for a fact," says Steven Aftergood, director of the Federation of American Scientist’s Project on Government Secrecy. But "when you build a facility of that scale, it’s probably meant to be used, and the storage and processing of large volumes of collected data would seem to be a plausible use of this facility." — Charles Edwin Shipp ( talk) 19:19, 2 July 2013 (UTC)
Lynnette and I will be traveling from Utah Valley to Salt Lake Valley via 'Point of the Mountain' pass. I plan to stop and see what I can find out about the open house and whether there will be tours. We were at the Pentagon and there were tours, but you needed reservations with three days' advance notice for clearance checking. Are there other questions I should ask the guard or the voice system? Charles Edwin Shipp ( talk) 03:07, 7 August 2013 (UTC)
Because of the Edward Snowden leaks, expect a lot of new information to come in between now and the NSA Utah Data Center opening (said to be in October, but some articles say "September or October": probably operational in September with ribbon-cutting & public tours in October). I'll be there, since my wife and I have a lot of family in American Fork, Utah County, and Salt Lake County. FYI, the facility is on the county line, on Camp Williams Army Base, next to the state prison.
This just in:
Charles Edwin Shipp ( talk) 19:40, 2 July 2013 (UTC)
I moved these comments orphaned at the top without a subhead. 96.227.66.159 ( talk) 23:36, 10 July 2013 (UTC)
A local story from Bluffdale. The center is expected to use 1.7 million gallons of water per day when fully operational.
http://www.ksl.com/?sid=25978926&nid=148
-- 71.20.55.6 ( talk) 16:14, 13 July 2013 (UTC)
For the Great Yottabyte debate.
http://hothardware.com/News/Hard-Drive-Capacity-Could-Increase-to-60TB-by-2016-IHS-iSuppli-Says/ -- 71.20.55.6 ( talk) 07:27, 14 July 2013 (UTC)
This may or may not qualify as original research. Nonetheless, I'm including it here.
The center is expected to use 65 megawatts of electricity. Knowing the power consumption of large-capacity drives, you can put an upper limit on the storage by assuming that all electricity used will power HDDs. Much of the power will go to cooling and to equipment besides drives, of course.
The lowest average operating power consumption for 4 TB Seagate Enterprise drives is 11.27 watts. [1] This results in a storage capacity of 23 exabytes [2].
Assuming that capacity is more important than speed, a lower-power model using only 6.49 watts can be found, [3] resulting in 40 exabytes. Assuming that 60 TB drives show up a few years down the line that use only 6.5 watts, the capacity would be around 600 exabytes.
Switching to SSDs is a costly proposition, but it significantly reduces the power consumption per byte (the physical dimensions are smaller, too, and the drives are much faster). A 960 GB model that uses 0.6 watts would result in 104 exabytes.
Whatever the scenario, the storage capacity of that facility should still be expected to be in exabytes, rather than yottabytes, given available technology.
-- 71.20.55.6 ( talk) 20:47, 15 July 2013 (UTC)
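The arithmetic above can be reproduced in a few lines. A minimal sketch, using the wattage and capacity figures quoted in this thread (not official specifications):

```python
# Upper bound on storage if the entire 65 MW power budget ran drives.
# Ignores cooling, servers, and networking, so real capacity is lower.
POWER_BUDGET_W = 65e6  # ~65 megawatts, as reported
TB, EB = 1e12, 1e18

def max_capacity_bytes(drive_capacity_bytes, drive_power_w,
                       power_w=POWER_BUDGET_W):
    """Number of drives the power budget supports, times capacity per drive."""
    return (power_w / drive_power_w) * drive_capacity_bytes

print(max_capacity_bytes(4 * TB, 11.27) / EB)  # ~23 exabytes (4 TB HDD)
print(max_capacity_bytes(4 * TB, 6.49) / EB)   # ~40 exabytes (low-power HDD)
print(max_capacity_bytes(960e9, 0.6) / EB)     # ~104 exabytes (960 GB SSD)
```

All three results match the figures in the comment above, confirming it is simple division and multiplication rather than anything exotic.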
Power consumption is probably not a useful measure of storage capacity. If they use hard drives they would only need to have a few of them turned on to read and write while the rest are not consuming power. They could conceivably be using modern magnetic tape storage, similar to the way the LHC's experiments store their data. Pulu ( talk) 22:13, 21 July 2013 (UTC)
References
How about "thousands of zettabytes" or millions of exabytes? — Charles Edwin Shipp ( talk) 00:20, 16 July 2013 (UTC)
That said, William Binney believes the capacity may be on the order of 5 zettabytes. He's made this claim at least twice, from what I've seen in videos of his speeches and interviews, most recently on 7-19-2013.
http://english.cntv.cn/program/newsupdate/20130719/104150.shtml
-- 71.20.55.6 ( talk) 09:39, 21 July 2013 (UTC)
The diagram of the data center for this page roughly matches the diagram from Wired magazine. But neither of these resembles the photos of the facility seen here and here. There is only one row of buildings, and the data halls are oriented differently. These two photos are consistent with the photo on the wiki. Pulu ( talk) 22:26, 21 July 2013 (UTC)
Brewster Kahle estimate: 12 exabytes. Paul Vixie estimate: 3 exabytes
-- 71.20.55.6 ( talk) 02:31, 25 July 2013 (UTC)
Anon (3 days ago): How does one go about obtaining permission to upload the blueprints to Wikipedia for the article about the Utah data center?
Reply from Kashmir Hill, Forbes Staff (3 days ago): They are government docs so I don’t think we’d claim ownership of them.
http://www.forbes.com/sites/kashmirhill/2013/07/24/blueprints-of-nsa-data-center-in-utah-suggest-its-storage-capacity-is-less-impressive-than-thought/ — Preceding unsigned comment added by 71.20.55.6 ( talk) 07:16, 29 July 2013 (UTC)
In August 2012, The New York Times published short documentaries by independent filmmakers entitled The Program,[7] based on interviews with a whistleblower named William Binney, a designer of the NSA's Stellar Wind project.
This is a bit misleading. Binney designed a program called ThinThread, the backend of which was later bastardized by a third-party contractor and integrated into Trailblazer, then bastardized again and inserted into Stellar Wind. Binney had left the NSA by the time Stellar Wind got going. It's not entirely unfair to categorize Binney as a designer of Stellar Wind (and he sometimes describes himself as such), but it's not a complete picture. Stellar Wind was the reason for the James Comey rebellion in 2004, in which Ashcroft, gravely ill in hospital, made the decision to allow Comey to kill the project. The primary sources are Binney's interviews, one with Laura Poitras, referenced above. Tom Drake is another source.
There is a question of how dead Stellar Wind really is:
http://www.washingtonpost.com/investigations/us-surveillance-architecture-includes-collection-of-revealing-internet-phone-metadata/2013/06/15/e9bf004a-d511-11e2-b05f-3ea3f0e7bb5a_story.html -- 71.20.55.6 ( talk) 19:38, 29 July 2013 (UTC)
If you do a Google search with the search terms [ Utah data center openhouse october 2013 ], you will see that the opening ceremony for the Utah Data Center [NSA Utah Data Center] has been postponed, if you can believe the information found by the Wall Street Journal (WSJ), Reuters, and the Salt Lake Tribune. In my skimming yesterday, I saw that the group protecting the Constitution, "Restore the Fourth [Amendment], Utah", [5] marched in protest a few days ago (can you say sparks were flying?), and a day or two ago it was announced that the October ribbon-cutting ceremony would be postponed for maybe a year, if you believe it. [6]
In a Salt Lake Tribune article, one reporter/writer notes that the Utah Data Center may be open already. [7]
If you do the Google search suggested, a favorite site of mine has excellent leads but is a parody of the NSA.gov site (so view it for leads, but not for referencing here): =[;-) :: [8] They have excellent pictures of the site! Etc.!
From NSA, their bottom line: "Washington • The National Security Agency says electrical problems at its Utah Data Center were not as dire as reported earlier this week, didn’t damage any expensive computer equipment and shouldn’t delay the opening of the massive storage site. 'Our current assessment is this issue will be fully resolved, mission systems will be installed on schedule, and the project will remain within budget,' NSA’s director for installations and logistics, Harvey Davis, says in a letter sent to congressional intelligence committees this week. The Salt Lake Tribune obtained a summary and discussion of the letter." [9] -- Charles Edwin Shipp ( talk) 09:11, 14 October 2013 (UTC)
Hi, should we replace the older map with a more accurate version? ` a5b ( talk) 02:42, 31 December 2013 (UTC)
Defending against cyber-attacks draws other attacks to the Utah government, even though it is not attached to the facility.
Headline-1: Massive Utah cyberattacks — up to 300 million per day — may be aimed at NSA facility
QUOTE: "10,000-fold increase » Since the facility was built, Utah government has had up to 300 million attempted attacks a day." -- AstroU ( talk) 23:19, 7 February 2015 (UTC) -- PS: New news today; FYI for future editing.
Concerns, criticism, and also ambivalence about mass surveillance continue among citizens in Utah and across the nation, according to this subscription-only WSJ article.
Headline: A Top-Secret NSA Site Draws Swipes, Shrugs
QUOTE: "Utah’s reaction to data center mirrors wider ambivalence over surveillance programs -- As a defiant statement against what it sees as government overreach, a group of Utahns “adopted” the desert highway that leads to the National Security Agency’s secretive and sprawling new facility in Bluffdale." -- The Wall Street Journal (online) gives this brief thumbnail, but a subscription is required to read the entire article. -- Narnia.Gate7 ( talk) 00:45, 2 May 2015 (UTC)
As of October 2019, why are there so many phrases that start with "is expected to," and why is the facility referred to as "projected," if it's already been built? 76.189.141.37 ( talk) 23:59, 5 October 2019 (UTC)
This is the
talk page for discussing improvements to the
Utah Data Center article. This is not a forum for general discussion of the article's subject. |
Article policies
|
Find sources: Google ( books · news · scholar · free images · WP refs) · FENS · JSTOR · TWL |
![]() | This article is rated C-class on Wikipedia's
content assessment scale. It is of interest to the following WikiProjects: | |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
|
This is text from the Camp Williams artcile that was deleted (I have restored) but has several older refs re the 'Utah Data Centre', which was apparently previously called Community Comprehensive National Cybersecurity Initiative Data Center.
220 of Borg 10:24, 9 March 2014 (UTC) nb. Restored text from April 2012
{{
cite news}}
: Check date values in: |date=
(
help)
- 220 of Borg 04:52, 21 April 2012 (UTC)
About this added paragraph:-- Tomwsulcer ( talk) 18:35, 27 August 2012 (UTC)
My concerns are regarding WP:OR and WP:UNDUE. Can the new paragraph addition be shortened to only those aspects that explicitly refer to the Utah Data Center?-- Tomwsulcer ( talk) 21:00, 25 August 2012 (UTC)
http://www.dbia-mar.org/downloads/Army%20DBIA%20Mar16%202010.pdf references Army Core of Engineers project 21078 in a report from March 16, 2012, listing the project as $ 1,489,000,000 and calling it "Bumblehive UDC (Incr 1- 5)" - UDC probably stands for Utility Data Center. 67.188.202.139 ( talk) 20:34, 2 December 2012 (UTC)
Multiple yotabytes? Absurd. That exceeds the worldwide platter production by a an order of magnitude (annual production runs around 500M). At 300Mb/sec, it would take 10 million years to write that much data. (yes, I recognize this would be done in parallel, I'm trying to put a scope on the problem). On tape, the only feasible solution (due to power/heat issues), it would take more in the 100M year range. Multiple drives would be failing every second. Come on, this is supposed to be an Encyclopedia.
All of these stories come from the same CNET source, which misread a statement that we need to eventually handle a yottabyte of information. Handle, not store. I quote the relevant information "as a 2007 Department of Defense report puts it, the Pentagon is attempting to expand its worldwide communications network, known as the Global Information Grid, to handle yottabytes (1024 bytes) of data." That's a world wide network, not the Utah facility, and the word used is 'handle'.
I am editing the article to remove the yottabyte claim.
— Preceding unsigned comment added by 71.92.252.196 ( talk) 17:11, 14 June 2013 (UTC)
I agree, the yottabyte claim is completely absurd. I did some calculations on this. A yottabyte would require a data center would cost $10 Trillion a 1 cent per GB, almost the whole GDP of the united states. And judging from the density of magnetic storage medium, it would also have to be 15 miles across, which is clearly counter indicated by the photons of it. Here are my calculations. Pulu ( talk) 21:58, 21 July 2013 (UTC)
Surely the article is in error. 1 Yottabyte = 1 Trillion Terabytes. At a price of $50 a terabyte (very conservative estimate) this would put the cost of the data center at $50 trillion dollars which far exceeds the federal budget of 3.6 trillion. — Preceding unsigned comment added by 71.201.194.191 ( talk) 22:03, 17 April 2013 (UTC)
Regarding cost, remember that the buyer is not your ordinary consumer! —
Charles Edwin Shipp (
talk)
14:12, 8 June 2013 (UTC)
I think the news reports regarding the Utah facility are fabrication of real data presented by the US government to another media source and designed to invite the public into entertainment reagarding the power-house of funding. The article is not NPOV and should be edited by an expert from the Department of Justice. Fatum81 ( talk) 02:58, 9 June 2013 (UTC)
"A yottabyte is so big as to be nearly unimaginable by casual computer users: It’s enough information to fill 200 trillion DVDs." . . .
"The companies participating in PRISM produce enormous amounts of data every day, so storing it would require computing power the likes of which the public has never seen. People who study technology and security believe that’s why the NSA [built] a million-square-foot data center near Salt Lake City."
http://www.theblaze.com/stories/2013/06/07/11-questions-you-probably-have-about-u-s-domestic-spying-answered/
Charles Edwin Shipp (
talk)
14:09, 8 June 2013 (UTC)
Here are the prefixes you can use, from Binary_prefix:
Yottabyte is the last one listed, so is the expected data even larger?
I’ll leave it to others to say how soon the NSA Utah Data Center will fill up.
Another measure is the Googol (100 zeros) — Charles Edwin Shipp ( talk) 16:01, 8 June 2013 (UTC)
To avoid saying 'yottabytes' one computer scientist at the University of Utah says instead, "thousands of zettabytes". [2] Funny? Charles Edwin Shipp ( talk)
Just in case you want to discuss if this dimension of data makes any sense: skeptics.stackexchange -- MartinThoma ( talk) 05:14, 3 July 2013 (UTC)
Thanks, that was interesting reading (skimming) and one 'take-away quote' is: "An NSA spokeswoman says the actual data capacity of the center is classified." Another two thoughts I had while reading/skimming: (1) to think about the size/capacity, ask Edward Snowden who has inside knowledge and indications are that such sites (multiple sites) are collecting every eMail etc and storing the text not just the metadata. (2) So how much would that take? (3) It is a military strategy to not reveal your power/capability. — Charles Edwin Shipp ( talk) 13:35, 5 July 2013 (UTC) PS: The storage estimates for Facebook and particle accelerators were very interesting also.
Sources such as the NSA's mission statement, or remarks by generals, are essentially primary sources which can be used on some occasions but only with caution, and in this case, I think we need more references. A recent change says essentially that the NSA's own declaration of its own mission is factually correct. Here is the current wording:-- Tomwsulcer ( talk) 13:55, 18 April 2013 (UTC)
My problem is we really need objective, reliable, secondary sources to back this up, and that we should not take the NSA at face-value here. Or, saying the NSA's declaration is, indeed, the case, is us committing original research, which is why I think we need to state that the NSA conceives its mission as such and such, but leave open alternate possibilities.-- Tomwsulcer ( talk) 13:55, 18 April 2013 (UTC)
If useful, please use the following for references...
The state flag of Utah highlights a beehive, — motto means industriousness. The website (apparently from the US government) http://nsa.gov1.info/utah-data-center/ identifies the codename and mentions 'Bumblehive' 24 times. FYI, Charles Edwin Shipp ( talk) 01:37, 9 June 2013 (UTC)
Much of this content was derived from news media, privacy groups, and government websites. Links to these sites are posted on the left-sidebars of each page." So I would conclude that 'Bumblehive' was never a military codename. Charles Edwin Shipp ( talk) 14:09, 9 June 2013 (UTC)
I removed a bunch of original research (mostly stuff about FISA and the law, which isn't really about this article. Basically the entire "possible purposes" section was OR.) However, one sentence I removed seems like it should still be included. But the source it was cited to was incorrect, so I was wondering if anyone knows the real source so we can add the sentence back in. The relevant text is: NSA whistleblower William Binney alleged that the Bluffdale facility was designed to store a broad range of domestic communications for data mining without warrants. Capscap ( talk) 08:16, 9 June 2013 (UTC)
Here is a reference implying use: "NSA Whistleblower Speaks Out on Verizon, PRISM, and the Utah Data Center" [3] — Charles Edwin Shipp ( talk) 14:12, 9 June 2013 (UTC)
There might be some new information here:
Capscap ( talk) 08:24, 9 June 2013 (UTC)
If you read the following article,
a reader makes this correction: "For the record, they are Intel “Xeon” processors. Not “Xenon Core” processors."
Here's an interesting item showing that Apple Co. needs support for its iCloud and iPhone/iPad.
http://blogs.wsj.com/digits/2013/07/01/apple-invests-in-solar-farm-for-nevada-data-center/?mod=trending_now_5
Charles Edwin Shipp (
talk)
19:09, 2 July 2013 (UTC)
Our reference (to the Thomas Burr piece) at the bottom of our WP article notes that, "The Utah Data Center will be part of NSA’s interconnected network that includes sites in Colorado, Georgia and Maryland, and since the Utah facility will be the largest, there is a good chance Americans’ phone call data could land in the Bluffdale site at least temporarily. "I wouldn’t say I know it for a fact," says Steven Aftergood, director of the Federation of American Scientist’s Project on Government Secrecy. But "when you build a facility of that scale, it’s probably meant to be used, and the storage and processing of large volumes of collected data would seem to be a plausible use of this facility." — Charles Edwin Shipp ( talk) 19:19, 2 July 2013 (UTC)
Lynnette and I will be traveling from Utah Valley to Salt Lake Valley via 'Point of the Mountain' pass. I plan to stop and see what I can find out about the openhouse and if there will be tours. We were at the Pentagon and there were tours, but you needed reservations with a three-day advance notice to for clearance checking. Are there other questions I should ask the guard or the voice system? Charles Edwin Shipp ( talk) 03:07, 7 August 2013 (UTC)
Because of Edward Snowden leaks, expect a lot of new information to come in between now and the NSA Utah Data Center opening (said to be in October, but some articles say "September or October": probably operational in September with ribbon-cutting & public tours in October.) I'll be there since my wife and I have a lot of family in American Fork, Utah County, and Salt Lake County. FYI, the facility is on the county line; on Camp Williams Army Base, next to the state prison.
This just in:
Charles Edwin Shipp ( talk) 19:40, 2 July 2013 (UTC)
I moved these comments orphaned at the top without a subhead. 96.227.66.159 ( talk) 23:36, 10 July 2013 (UTC)
A local story from Bluffdale. The center is expected to use 1.7 million gallons of water per day when fully operational.
http://www.ksl.com/?sid=25978926&nid=148
-- 71.20.55.6 ( talk) 16:14, 13 July 2013 (UTC)
For the Great Yottabyte debate.
http://hothardware.com/News/Hard-Drive-Capacity-Could-Increase-to-60TB-by-2016-IHS-iSuppli-Says/ -- 71.20.55.6 ( talk) 07:27, 14 July 2013 (UTC)
This may or may not qualify as Original research. Nonetheless, I'm including it here.
The center is expected to use 65 Megawatts of electricity. By knowing the power consumption of large capacity drives, you can put an upper limit on the storage by assuming that all electricity used will power HDD drives. Much of the power will go to cooling, and equipment besides drives, of course.
The lowest average operating power consumption for 4TB Seagate Enterprise drives is 11.27Watts. [1] This results in a storage capacity of 23 Exabytes [2].
Assuming that capacity is more important than speed, a lower power model using only 6.49 Watts can be found [3] resulting in 40Exabytes. Assuming that 60TB drives show up a few years down the line that only use 6.5 watts, the capacity would be around 600Exabytes.
Switching to SSDs is a costly proposition, but significantly reduces the power consumption per byte (and also the actual dimensions are less, and the drives are a much faster). A 960GB model that uses 0.6 Watts would result in 104Exabytes.
Whatever the scenario, the storage capacity of that facility should still expected to be in Exabytes, rather than Yottabytes, given available technology.
-- 71.20.55.6 ( talk) 20:47, 15 July 2013 (UTC)
Power consumption is probably not a useful measure of storage capacity. If they use hard drives they would only need to have a few of them turned on to read and write while the rest are not consuming power. They could conceivably be using modern magnetic tape storage, similar to the way the LHC's experiments store their data. Pulu ( talk) 22:13, 21 July 2013 (UTC)
References
How about "thousands of zettabytes" or millions of exebytes? — Charles Edwin Shipp ( talk) 00:20, 16 July 2013 (UTC)
That said,
William Binney believes the capacity may be on the order of 5 Zetabytes. He's made this claim at least twice, from what I've seen in videos of his speeches and interviews. Most recently 7-19-2013.
http://english.cntv.cn/program/newsupdate/20130719/104150.shtml
-- 71.20.55.6 ( talk) 09:39, 21 July 2013 (UTC)
The diagram of the data center for this page roughly matches the diagram from wired magazine. But neither of these resemble the photos of the facility seen here and here. There is only one row of buildings and the data halls are oriented differently. These two photos are consistent with the photo on the wiki. Pulu ( talk) 22:26, 21 July 2013 (UTC)
Brewster Kahle estimate: 12 exabytes. Paul Vixie estimate: 3 exabytes
-- 71.20.55.6 ( talk) 02:31, 25 July 2013 (UTC)
Anon 3 days ago
How does one go about obtaining permission to upload the blueprints to Wikipedia for the article about the Utah datacenter?
Called-out comment
Reply Author Kashmir Hill Kashmir Hill, Forbes Staff 3 days ago
They are government docs so I don’t think we’d claim ownership of them.
Called-out comment
http://www.forbes.com/sites/kashmirhill/2013/07/24/blueprints-of-nsa-data-center-in-utah-suggest-its-storage-capacity-is-less-impressive-than-thought/ — Preceding unsigned comment added by 71.20.55.6 ( talk) 07:16, 29 July 2013 (UTC)
In August 2012, The New York Times published short documentaries by independent filmmakers entitled The Program,[7] based on interviews with a whistleblower named William Binney, a designer of the NSA's Stellar Wind project.
This is a bit misleading. Binney designed a program called Thinthread, the backend of which was later bastardized by a third-party contractor and integrated into Trailblazer, and again bastardized and inserted into Stellar Wind. Binney had left the NSA by the time Stellar Wind got going. It's not entirely unfair to categorize Binney as a designer of Stellar Wind (and he sometimes describes himself as such), but it's not a complete picture. Stellar Wind was the reason for the James Comey rebellion in 2004, in which Ashcroft, gravely ill in hospital, made the decision to allow Comey to kill the project. The primary sources are Binney's interviews, one with Laura Poitras, referenced above. Tom Drake is another source.
There is a question of how dead Stellar Wind really is:
http://www.washingtonpost.com/investigations/us-surveillance-architecture-includes-collection-of-revealing-internet-phone-metadata/2013/06/15/e9bf004a-d511-11e2-b05f-3ea3f0e7bb5a_story.html -- 71.20.55.6 ( talk) 19:38, 29 July 2013 (UTC)
If you do a Google search with the search terms [ Utah data center openhouse october 2013 ] you will see that the opening ceremony for the Utah Data Center [NSA Utah Data Center] has been postponed, if you can believe the information reported by the Wall Street Journal (WSJ), Reuters, and the Salt Lake Tribune. In my skimming yesterday, I saw that the constitution-defending group "Restore the Fourth [Amendment], Utah" [5] marched in protest a few days ago (can you say sparks were flying?), and a day or two ago it was announced that the October ribbon-cutting ceremony would be postponed for maybe a year, if you believe it. [6]
In a Salt Lake Tribune article, one reporter/writer notes that the Utah Data Center may be open already. [7]
If you do the Google search suggested, a favorite site of mine has excellent leads but is a parody of the NSA.gov site (so view it for leads, but not for referencing here): =[;-) :: [8] They have excellent pictures of the site!
From NSA, their bottom line: "Washington • The National Security Agency says electrical problems at its Utah Data Center were not as dire as reported earlier this week, didn't damage any expensive computer equipment and shouldn't delay the opening of the massive storage site. 'Our current assessment is this issue will be fully resolved, mission systems will be installed on schedule, and the project will remain within budget,' NSA's director for installations and logistics, Harvey Davis, says in a letter sent to congressional intelligence committees this week. The Salt Lake Tribune obtained a summary and discussion of the letter." [9] -- Charles Edwin Shipp ( talk) 09:11, 14 October 2013 (UTC)
Hi, should we replace the older map with a more accurate version? ` a5b ( talk) 02:42, 31 December 2013 (UTC)
Defending against cyber-attacks draws other attacks to the Utah government, even though the state government is not attached to the facility.
Headline-1: Massive Utah cyberattacks — up to 300 million per day — may be aimed at NSA facility
QUOTE: "10,000-fold increase » Since the facility was built, Utah government has had up to 300 million attempted attacks a day." -- AstroU ( talk) 23:19, 7 February 2015 (UTC) -- PS: New news today; FYI for future editing.
Concerns, criticism, and ambivalence about mass surveillance continue among citizens in Utah and across the nation, according to this subscription-only WSJ article.
Headline: A Top-Secret NSA Site Draws Swipes, Shrugs
QUOTE: "Utah’s reaction to data center mirrors wider ambivalence over surveillance programs -- As a defiant statement against what it sees as government overreach, a group of Utahns “adopted” the desert highway that leads to the National Security Agency’s secretive and sprawling new facility in Bluffdale." -- The Wall Street Journal (online) gives this brief thumbnail, but requires a subscription to read the entire article. -- Narnia.Gate7 ( talk) 00:45, 2 May 2015 (UTC)
As of October 2019, why are there so many phrases that start with "is expected to," and why is the facility referred to as "projected," if it's already been built? 76.189.141.37 ( talk) 23:59, 5 October 2019 (UTC)