11/4/11

Apple might end the Mac Pro line

According to arstechnica
Apple’s top-of-the-line workstation, the Mac Pro, hasn’t been updated in well over a year, waiting on Intel’s delayed (and delayed again) Sandy Bridge-E based Xeon processors. The previous upgrade cycle was equally long, at nearly 18 months. Now, rumors are circulating that Apple management has been contemplating just pulling the plug on the Mac Pro altogether.
According to sources speaking to AppleInsider, a planned Mac Pro revision has been in the works for quite some time, but Apple management has been debating its fate as far back as May 2011. Sales of the high-end workstation have dropped considerably to both consumers and enterprise customers, and so profits have taken a nosedive.
Our own Peter Bright contends that the enterprise just isn’t interested in expensive workstations on the whole, and those on the bleeding edge of hardcore performance generally aren’t looking at a Mac. That theory seems to jibe with the rumored low sales figures for Mac Pros; portables are an increasingly large part of Apple's Mac sales—now nearly three-quarters—and desktop sales are primarily iMacs, according to Apple.
With sales so low and profits dwindling, is the Mac Pro just an expensive anachronism? It may be so for some users, especially those who value portability over raw power, don’t require upgrades, and whose expansion needs are served by the PCI Express-based Thunderbolt port Apple introduced across its line over the past year. A few users who previously relied on Mac Pros told Ars they have already traded in for the latest svelte MacBook Air models, for instance.

Not ready to give up

Hardcore Mac Pro users aren’t ready for Apple to give up on them just yet, though. We spoke to a number of professionals, largely in the content creation business, who told Ars that iMacs and Mac minis just aren’t the right solution for their needs.
Jon Alper, a Boston-based independent consultant for various media production interests, suggested he would prefer that his Mac Pros were pried from his cold, dead hands. “I just can't fathom functioning without them,” he told Ars. “I have two sitting on my desk right now, both mostly crunching video.”
Alper cut his teeth crunching numbers on Mac hardware at Harvard Medical School and later managed the IT needs of WGBH Boston's roughly 70-seat Interactive production division. “We rotated new Mac Pros in about every 12 months, and the older machines would get passed around to users that had older machines yet,” Alper said. “Any given machine would typically have a usable life of three to five years, and redeploying to new users was so simple—just pull the drive sled, swap, and redeploy.”
“Managing four terabytes of video data, running twelve to twenty-four hour long effects rendering batches—you just can’t really do that without a Mac Pro,” Alper said.
iMacs or Mac minis just aren’t a suitable replacement in production environments like WGBH Interactive, Alper insists. “Unless you have to pull the motherboard, Mac Pros are absurdly easy to work on,” Alper said. Swapping drives, adding RAM, or adding PCI Express cards or GPUs are relatively simple tasks on a Mac Pro; the same can't be said even of most PC towers. “I just don't want to have to find a tech dexterous enough to pull the glass off an iMac with suction cups when just about anyone can pull and replace a drive in a Mac Pro,” he said.
For those at the bleeding edge of design, content creation, and scientific computing, the Mac Pro offers a number of advantages over Apple's other hardware. Dual multicore processors, enough slots for obscene amounts of RAM, the ability to run internal RAIDs, customizable GPUs, and the ability to expand functionality with PCI Express cards were all cited by users as reasons to keep the Mac Pro around.
Dr. David Chen of the Office of High Performance Computing and Communications at the National Library of Medicine told Ars that his small five-person team has relied exclusively on Mac Pros using a large Xsan file store. If Apple discontinues the Mac Pro, “I’d miss not having an NVIDIA Quadro,” Chen said. “It’s got lots of memory and seems very bulletproof. And my coworker’s Pro has 64 gig of RAM; he is always going to want more memory.”
IT systems administrator and longtime Mac gamer Tom Johnson told Ars that video card options are critical for him on a personal level. “I have generally bought tower cases so I could replace the video card,” he said. “I expect to get five years out of the Mac, but only two out of a video card.”
“The other thing that is nice about Mac Pros is the dual processor. On the high end you get eight cores—I just don't see Apple putting that in any iMac,” Johnson said.
Other users appreciate Apple's use of higher-end Xeon processors. Web developer Enrique Ortiz, a former Microsoft Systems Engineer, noted that Xeons are typically reserved for server hardware. “There are very few systems equivalent to this machine in the Windows environment for the desktop,” he said. “In my experience, only the hardcore geeks could set up a great system like this running Windows or Linux.”
“If Apple kills the Mac Pro I would be devastated—Mac OS X is a cleaner environment for me and I’d really hate to go back,” Ortiz said.
Developers also often rely on Mac Pros to shave significant time off the complex app building process. “I use a Mac Pro at the office to build Mac and iPhone apps,” developer Raphael Sebbe said. “Working with Xcode, which is highly parallel, makes full use of the cores. My Mac Pro has eight cores—16 virtual ones—and does a fresh build about five times faster than on my MacBook Pro.”

Creating Windows switchers

Apple killing the Mac Pro could reverse some of the switching trend that the company has relied on to expand its user base. Apple claims on nearly every earnings call that about half of new Mac buyers at its retail stores were previously PC users, and those switchers contribute to its quarterly Mac sales records.
Some diehard Mac users just won't switch back. “Could I switch to Windows? Yes, since Adobe makes its Creative Suite for Windows,” graphic designer Christopher Cobble told Ars. “Would I switch? Not even if I had to use a Mac mini.”
But other users wouldn't be able to get by with less expandable, less flexible iMacs or Mac minis. If Apple drops the Mac Pro, Alper said, “I’m gonna buy the biggest, fastest one I can find and just wait. And then hope Windows 8 is as awesome as it’s promised to be. Other Mac hardware just doesn’t have the flexibility and control that I need.”
And researchers in Chen's group are already contemplating a switch to Linux. “Our new post-doc is a Linux person, so he's ordering a pimped out machine from Colfax Systems,” he said. “Another coworker said if the Pros go, he's going Linux.”

"Dire" long-term consequences?

If Apple does decide to kill the Mac Pro, Alper believes the ill effects will extend beyond users' immediate needs. “The risk is dire, in my opinion,” he told Ars. “When Apple does things that make it easier for IT guys to say ‘no’ to Apple hardware, they do themselves a disservice. Things like the consumerization of Lion Server, they make it easier for the corporate Windows IT guy to just say ‘no,’ but when they have one of the best personal computers on the market, the Mac Pro, it makes it easier to say ‘yes.’”
“Employees have been gaining grassroots support for Macs by bringing in their own machines. Those guys in IT, they will use the Mac Pro's death as a reason to cut support,” Alper said.
Unfortunately, as iPhones and the iOS ecosystem have come to represent 70 percent of Apple's revenue, what little enterprise support the company has offered in the past has waned. “Apple has been abandoning the enterprise market for years,” Dan Reshef, Director of Information Technology at CUNY Graduate School of Journalism, told Ars. “It began with the end of life of the XRAID, followed by the server class machines, and the Mac Pro could be next.”
“Apple has historically been a computer company driven by an interest to provide a platform for content creation,” Reshef continued. “However, Apple has a [recent] history of simplifying and eliminating products that were a drain on its resources. The enterprise products are likely a resource drain they'd prefer to allocate towards more profitable pursuits, even if it means damaging their own ecosystem and abandoning some of the content creators in the process.”

11/3/11

Google Maps to charge for usage

According to BBC:
From 1 January 2012, Google will charge for the Google Maps API service when more than 25,000 map "hits" are made in a day.
Websites, especially travel firms, use Google Maps to link customers to a view of the destinations they inquire about.
Google is rumoured to be charging $4 per 1,000 views in excess of the limit.
Google maintains that the high limit of 25,000 free daily hits means the charges "will only affect 0.35% of users".
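The overage math under the rumored pricing is straightforward. The sketch below assumes the charge is prorated per view beyond the free daily limit; Google could equally bill per full block of 1,000, so treat the rounding behavior as an assumption.

```python
# Rumored Google Maps API pricing, per the BBC report:
# 25,000 free map "hits" per day, then $4 per 1,000 excess views.
FREE_DAILY_LIMIT = 25_000
PRICE_PER_1000_EXCESS = 4.00  # USD (rumored figure)

def daily_charge(map_loads: int) -> float:
    """Estimated daily charge for a site serving `map_loads` map views,
    assuming the excess is prorated per view."""
    excess = max(0, map_loads - FREE_DAILY_LIMIT)
    return (excess / 1000) * PRICE_PER_1000_EXCESS

# A busy travel site serving 100,000 map views a day:
print(daily_charge(100_000))  # 75,000 excess views -> $300.00/day
print(daily_charge(20_000))   # under the limit -> $0.00
```

At that rate, a site ten times over the limit would face charges on the order of $100,000 a year, which explains why high-volume travel firms would take notice.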

'Secure future'

Google said it was aware that developers needed time to evaluate their usage, determine if they were affected and then take action as appropriate.
"We understand that the introduction of these limits may be concerning," said Thor Mitchell, product manager of the Maps API at Google.
"However, with the continued growth in adoption of the Maps API, we need to secure its long-term future by ensuring that even when used by the highest-volume for-profit sites, the service remains viable."

11/2/11

U.S. government never took time to understand cancer concerns about airport X-ray scanners

According to ProPublica:

One after another, the experts convened by the Food and Drug Administration raised questions about the machine because it violated a longstanding principle in radiation safety — that humans shouldn’t be X-rayed unless there is a medical benefit.
“I think this is really a slippery slope,” said Jill Lipoti, who was the director of New Jersey’s radiation protection program. The device was already deployed in prisons; what was next, she and others asked — courthouses, schools, airports? “I am concerned … with expanding this type of product for the traveling public,” said another panelist, Stanley Savic, the vice president for safety at a large electronics company. “I think that would take this thing to an entirely different level of public health risk.”
The machine’s inventor, Steven W. Smith, assured the panelists that it was highly unlikely that the device would see widespread use in the near future. At the time, only 20 machines were in operation in the entire country.
“The places I think you are not going to see these in the next five years is lower-security facilities, particularly power plants, embassies, courthouses, airports and governments,” Smith said. “I would be extremely surprised in the next five to 10 years if the Secure 1000 is sold to any of these.”
Today, the United States has begun marching millions of airline passengers through the X-ray body scanners, parting ways with countries in Europe and elsewhere that have concluded that such widespread use of even low-level radiation poses an unacceptable health risk. The government is rolling out the X-ray scanners despite having a safer alternative that the Transportation Security Administration says is also highly effective.
A ProPublica/PBS NewsHour investigation of how this decision was made shows that in post-9/11 America, security issues can trump even long-established medical conventions. The final call to deploy the X-ray machines was made not by the FDA, which regulates drugs and medical devices, but by the TSA, an agency whose primary mission is to prevent terrorist attacks.
Research suggests that anywhere from six to 100 U.S. airline passengers each year could get cancer from the machines. Still, the TSA has repeatedly defined the scanners as “safe,” glossing over the accepted scientific view that even low doses of ionizing radiation — the kind beamed directly at the body by the X-ray scanners — increase the risk of cancer.
“Even though it’s a very small risk, when you expose that number of people, there’s a potential for some of them to get cancer,” said Kathleen Kaufman, the former radiation management director in Los Angeles County, who brought the prison X-rays to the FDA panel’s attention.
About 250 X-ray scanners are currently in U.S. airports, along with 264 body scanners that use a different technology, a form of low-energy radio waves known as millimeter waves.
Robin Kane, the TSA’s assistant administrator for security technology, said that no one would get cancer because the amount of radiation the X-ray scanners emit is minute. Having both technologies is important to create competition, he added.
“It’s a really, really small amount relative to the security benefit you’re going to get,” Kane said. “Keeping multiple technologies in play is very worthwhile for the U.S. in getting that cost-effective solution — and being able to increase the capabilities of technology because you keep everyone trying to get the better mousetrap.”
Determined to fill a critical hole in its ability to detect explosives, the TSA plans to have one or the other operating at nearly every security lane in America by 2014. The TSA has designated the scanners for “primary” screening: Officers will direct every passenger, including children, to go through either a metal detector or a body scanner, and the passenger’s only alternative will be to request a physical pat-down.
How did the United States swing from considering such X-rays taboo to deeming them safe enough to scan millions of people a year?
A new wave of terrorist attacks using explosives concealed on the body, coupled with the scanners’ low dose of radiation, certainly convinced many radiation experts that the risk was justified.
But other factors helped the machines gain acceptance.
Because of a regulatory Catch-22, the airport X-ray scanners have escaped the oversight required for X-ray machines used in doctors’ offices and hospitals. The reason is that the scanners do not have a medical purpose, so the FDA cannot subject them to the rigorous evaluation it applies to medical devices.
Still, the FDA has limited authority to oversee some non-medical products and can set mandatory safety regulations. But the agency let the scanners fall under voluntary standards set by a nonprofit group heavily influenced by industry.
As for the TSA, it skipped a public comment period required before deploying the scanners. Then, in defending them, it relied on a small body of unpublished research to insist the machines were safe, and ignored contrary opinions from U.S. and European authorities that recommended precautions, especially for pregnant women. Finally, the manufacturer, Rapiscan Systems, unleashed an intense and sophisticated lobbying campaign, ultimately winning large contracts.
Both the FDA and TSA say due diligence has been done to assure the scanners’ safety. Rapiscan says it won the contract because its technology is superior at detecting threats. While the TSA says X-ray and millimeter-wave scanners are both effective, Germany decided earlier this year not to roll out millimeter-wave machines after finding they produced too many false positives.
Most of the news coverage on body scanners has focused on privacy, because the machines can produce images showing breasts and buttocks. But the TSA has since installed software to make the images less graphic. While some accounts have raised the specter of radiation, this is the first report to trace the history of the scanners and document the gaps in regulation that allowed them to avoid rigorous safety evaluation.

Little research on cancer risk of body scanners

Humans are constantly exposed to ionizing radiation, a form of energy that has been shown to strip electrons from atoms, damage DNA and mutate genes, potentially leading to cancer. Most radiation comes from radon, a gas produced from naturally decaying elements in the ground. Another major source is cosmic radiation from outer space. Many common items, such as smoke detectors, contain tiny amounts of radioactive material, as do exit signs in schools and office buildings.
As a result, the cancer risk from any one source of radiation is often small. Outside of nuclear accidents, such as that at Japan's Fukushima plant, and medical errors, the health risk comes from cumulative exposure.
In Rapiscan’s Secure 1000 scanner, which uses ionizing radiation, a passenger stands between two large blue boxes and is scanned with a pencil X-ray beam that rapidly moves left to right and up and down the body. In the other machine, ProVision, made by defense contractor L-3 Communications, a passenger enters a chamber that looks like a round phone booth and is scanned with millimeter waves, a form of low-energy radio waves, which have not been shown to strip electrons from atoms or cause cancer.
Only a decade ago, many states prohibited X-raying a person for anything other than a medical exam. Even after 9/11, such non-medical X-raying remains taboo in most of the industrialized world. In July, the European Parliament passed a resolution that security “scanners using ionizing radiation should be prohibited” because of health risks. Although the United Kingdom uses the X-ray machine for limited purposes, such as when passengers trigger the metal detector, most developed countries have decided to forgo body scanners altogether or use only the millimeter-wave machines.
While the research on medical X-rays could fill many bookcases, the studies that have been done on the airport X-ray scanners, known as backscatters, fill a file no more than a few inches thick. None of the main studies cited by the TSA has been published in a peer-reviewed journal, the gold standard for scientific research.
Those tests show that the Secure 1000 delivers an extremely low dose of radiation, less than 10 microrems. The dose is roughly one-thousandth of a chest X-ray and equivalent to the cosmic radiation received in a few minutes of flying at typical cruising altitude. The TSA has used those measurements to say the machines are “safe.”
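The comparisons above can be sanity-checked with rough arithmetic. The reference values below are assumptions for illustration: a chest X-ray is commonly quoted at roughly 10 millirem, and cosmic radiation at cruising altitude at roughly 0.3 millirem per hour; only the 10-microrem scan dose comes from the tests the article cites.

```python
# Back-of-the-envelope check of the dose comparisons in the text.
SCAN_DOSE_UREM = 10            # backscatter scan dose, microrem (cited tests)
CHEST_XRAY_UREM = 10_000       # ~10 millirem, assumed reference value
CRUISE_DOSE_UREM_PER_HR = 300  # ~0.3 millirem/hr at altitude, assumed value

# Fraction of a chest X-ray delivered by one scan:
print(SCAN_DOSE_UREM / CHEST_XRAY_UREM)               # 0.001, one-thousandth

# Minutes of flight at cruising altitude delivering the same dose:
print(60 * SCAN_DOSE_UREM / CRUISE_DOSE_UREM_PER_HR)  # 2.0 minutes
```

Both results line up with the "one-thousandth of a chest X-ray" and "a few minutes of flying" comparisons the TSA has cited.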
Most of what researchers know about the long-term health effects of low levels of radiation comes from studies of atomic bomb survivors in Hiroshima and Nagasaki. By charting exposure levels and cancer cases, researchers established a linear link: the higher the exposure, the greater the risk of cancer.
Some scientists argue the danger is exaggerated. They claim low levels stimulate the repair mechanism in cells, meaning that a little radiation might actually be good for the body.
But in the authoritative report on low doses of ionizing radiation, published in 2006, the National Academy of Sciences reviewed the research and concluded that the preponderance of research supported the linear link. It found “no compelling evidence” that there is any level of radiation at which the risk of cancer is zero.
Radiation experts say the dose from the backscatter is negligible when compared to naturally occurring background radiation. Speaking to the 1998 FDA panel, Smith, the inventor, compared the increased risk to choosing to visit Denver instead of San Diego or the decision to wear a sweater versus a sport coat.
Using the linear model, even such trivial amounts increase the number of cancer cases. Rebecca Smith-Bindman, a radiologist at the University of California, San Francisco, estimated that the backscatters would lead to only six cancers over the course of a lifetime among the approximately 100 million people who fly every year. David Brenner, director of Columbia University’s Center for Radiological Research, reached a higher number — potentially 100 additional cancers every year.
“Why would we want to put ourselves in this uncertain situation where potentially we’re going to have some cancer cases?” Brenner asked. “It makes me think, really, why don’t we use millimeter waves when we don’t have so much uncertainty?”
But even without the machines, Smith-Bindman said, the same 100 million people would develop 40 million cancers over the course of their lifetimes. In this sea of cancer cases, it would be impossible to identify the patients whose cancer is linked to the backscatter machines.
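The linear model behind these estimates reduces to simple multiplication: total dose across all scans times a risk coefficient. The coefficient below (about 5% lifetime cancer risk per sievert) is an assumed round figure in the range radiation-protection bodies use, not the number either expert applied; the article's estimates of six to 100 cases came from more detailed models.

```python
# Rough linear no-threshold (LNT) estimate of excess cancers from
# backscatter scans. Both constants are assumptions for illustration.
DOSE_PER_SCAN_SV = 0.1e-6  # 10 microrem ~= 0.1 microsievert per scan
RISK_PER_SV = 0.05         # assumed LNT coefficient: ~5% lifetime risk per Sv

def expected_cancers(total_scans: float) -> float:
    """Expected excess cancer cases under a linear dose-risk model:
    collective dose (scans x dose per scan) times the risk coefficient."""
    return total_scans * DOSE_PER_SCAN_SV * RISK_PER_SV

# If 100 million fliers each passed through the scanner ten times,
# the model predicts about five excess cases:
print(expected_cancers(1e9))
```

The result lands inside the six-to-100 range the article reports, which illustrates Brenner's point: under a linear model, any nonzero dose applied to hundreds of millions of people yields a nonzero number of expected cases.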

How the scanners avoided strict oversight

Although they deliberately expose humans to radiation, the airport X-ray scanners are not medical devices, so they are not subject to the stringent regulations required for diagnostic X-ray machines. 
If they were, the manufacturer would have to submit clinical data showing safety and effectiveness and be approved through a rigorous process by the FDA. If the machines contained radioactive material, they would have to report to the Nuclear Regulatory Commission.
But because it didn’t fit into either category, the Secure 1000 was classified as an electronic product. The FDA does not review or approve the safety of such products. However, manufacturers must provide a brief radiation safety report explaining the dose and notify the agency if any overexposure is discovered. According to the FDA, no such incidents have been reported.
Under its limited oversight of electronic products, the FDA could issue mandatory safety regulations. But it didn’t do so, a decision that flows from its history of supervising electronics. 
Regulation of electronic products in the United States began after a series of scandals. From the 1930s to the 1950s, it was common for a child to go to a shoe store and stand underneath an X-ray machine known as a fluoroscope to check whether a shoe was the right fit. But after cases arose of a shoe model’s leg being amputated and store clerks developing dermatitis from putting their hands in the beam to adjust the shoe, the practice ended.
In 1967, General Electric recalled 90,000 color televisions that had been sold without the proper shielding, potentially exposing viewers to dangerous levels of radiation. The scandal prompted the creation of the federal Bureau of Radiological Health.
“That ultimately led to a lot more aggressive program,” said John Villforth, who was the director of the bureau. Over the next decade, the bureau created federal safety standards for televisions, medical X-rays, microwaves, tanning beds, even laser light shows.
But in 1982, the FDA merged the radiological health bureau into its medical-device unit.
“I was concerned that if they were to combine the two centers into one, it would probably mean the ending of the radiation program because the demands for medical-device regulation were becoming increasingly great,” said Villforth, who was put in charge of the new Center for Devices and Radiological Health. “As I sort of guessed, the radiation program took a big hit.”
The new unit became stretched for scarce resources as it tried to deal with everything from tongue depressors to industrial lasers. The government used to have 500 people examining the safety of electronic products emitting radiation. It now has about 20 people. In fact, the FDA has not set a mandatory safety standard for an electronic product since 1985.
As a result, there is an FDA safety regulation for X-rays scanning baggage — but none for X-rays scanning people at airports.
Meanwhile, scientists began developing backscatter X-rays, in which the waves are reflected off an object to a detector, for the security industry.
The Secure 1000 people scanner was invented by Smith in 1991 and later sold to Rapiscan, then a small security firm based in southern California. The first major customer was the California prison system, which began scanning visitors to prevent drugs and weapons from getting in. But the state pulled the devices in 2001 after a group of inmates' wives filed a class-action lawsuit accusing the prisons of violating their civil liberties.
The U.S. Customs Service deployed backscatter machines for several years but in limited fashion and with strict supervision. Travelers suspected of carrying contraband had to sign a consent form, and Customs policy prohibited the scanning of pregnant women. The agency abandoned them in 2006, not for safety reasons but because smugglers had learned where the machines were installed and adapted their methods to avoid them, said Rick Whitman, the radiation safety officer for Customs until 2008.
Yet, even this limited application of X-ray scanning for security dismayed radiation safety experts. In 1999, the Conference of Radiation Control Program Directors, a nongovernmental organization, passed a resolution recommending that such screening be stopped immediately.
The backscatter machines had also caught the attention of the 1998 FDA advisory panel, which recommended that the FDA establish government safety regulations for people scanners. Instead, the FDA decided to go with a voluntary standard set by a trade group largely comprising manufacturers and government agencies that wanted to use the machines.
“Establishing a mandatory standard takes an enormous amount of resources and could take a decade to publish,” said Dan Kassiday, a longtime radiation safety engineer at the FDA.
In addition, since the mid-1990s, Congress has directed federal safety agencies to use industry standards wherever possible instead of creating their own.
The FDA delegated the task of establishing the voluntary standards to the American National Standards Institute. A private nonprofit that sets standards for many industries, ANSI convened a committee of the Health Physics Society, a trade group of radiation safety specialists. It was made up of 15 people, including six representatives of manufacturers of X-ray body scanners and five from U.S. Customs and the California prison system. There were few government regulators and no independent scientists.
In contrast, the FDA advisory panel was also made up of 15 people — five representatives from government regulatory agencies, four outside medical experts, one labor representative and five experts from the electronic products industry, but none from the scanner manufacturers themselves.
“I am more comfortable with having a regulatory agency — either federal or the states — develop the standards and enforce them,” Kaufman said. Such regulators, she added, “have only one priority, and that’s public health.”
A representative of the Health Physics Society committee said that was its main priority as well. Most of the committee’s evaluation was completed before 9/11. The standard was published in 2002 and updated with minor changes in 2009.
Ed Bailey, chief of California’s radiological health branch at the time, said he was the lone voice opposing the use of the machines. But after 9/11, his views changed about what was acceptable in pursuit of security.
“The whole climate of their use has changed,” Bailey said. “The consequence of something being smuggled on an airplane is far more serious than somebody getting drugs into a prison.”

Are inspections independent?

While the TSA doesn’t regulate the machines, it must seek public input before making major changes to security procedures. In July, a federal appeals court ruled that the agency failed to follow rule-making procedures and solicit public comment before installing body scanners at airports across the country. TSA spokesman Michael McCarthy said the agency couldn’t comment on ongoing litigation.
The TSA asserts there is no need to take additional precautions for sensitive populations, even pregnant women, following the guidance of the congressionally chartered National Council on Radiation Protection & Measurements.
But other authorities have come to the opposite conclusion. A report by France’s radiation safety agency specifically warned against screening pregnant women with the X-ray devices. In addition, the Federal Aviation Administration’s medical institute has advised pregnant pilots and flight attendants that the machine, coupled with their time in the air, could put them over their occupational limit for radiation exposure and that they might want to adjust their work schedules accordingly.
No similar warning has been issued for pregnant frequent fliers.
Even as people scanners became more widespread, government oversight actually weakened in some cases.
Inspections of X-ray equipment in hospitals and industry are the responsibility of state regulators — and before 9/11, many states also had the authority to randomly inspect machines in airports. But that ended when the TSA took over security checkpoints from the airlines.
Instead, annual inspections are done by Rapiscan, the scanners’ manufacturer.
“As a regulator, I think there’s a conflict of interest in having the manufacturer and the facility inspect themselves,” Kaufman said.
Last year, in reaction to public anger from members of Congress, passengers and advocates, the TSA contracted with the Army Public Health Command to do independent radiation surveys. But email messages obtained in a lawsuit brought by the Electronic Privacy Information Center, a civil liberties group, raise questions about the independence of the Army surveys.
One email sent by TSA health and safety director Jill Segraves shows that local TSA officials were given advance notice and allowed to “pick and choose” which systems the Army could check.
That email also suggests that Segraves considered the Army inspectors a valuable public-relations asset: “They are our radiation myth busters,” she wrote to a local security director.
Some TSA screeners are concerned about their own radiation exposure from the backscatters, but the TSA has not allowed them to wear badges that could measure it, said Milly Rodriguez, health and safety specialist for the American Federation of Government Employees, which represents TSA officers.
“We have heard from members that sometimes the technicians tell them that the machines are emitting more radiation than is allowed,” she said.
McCarthy, the TSA spokesman, said the machines are physically incapable of producing radiation above the industry standard. He added that the inspections give screeners a chance to ask questions about radiation and to have concerns about specific machines addressed.

U.S. Satellites hacked!

According to Bloomberg:

Computer hackers, possibly from the Chinese military, interfered with two U.S. government satellites four times in 2007 and 2008 through a ground station in Norway, according to a congressional commission.
The intrusions on the satellites, used for earth climate and terrain observation, underscore the potential danger posed by hackers, according to excerpts from the final draft of the annual report by the U.S.-China Economic and Security Review Commission. The report is scheduled to be released next month.
“Such interference poses numerous potential threats, particularly if achieved against satellites with more sensitive functions,” according to the draft. “Access to a satellite’s controls could allow an attacker to damage or destroy the satellite. An attacker could also deny or degrade as well as forge or otherwise manipulate the satellite’s transmission.”
A Landsat-7 earth observation satellite system experienced 12 or more minutes of interference in October 2007 and July 2008, according to the report.
Hackers interfered with a Terra AM-1 earth observation satellite twice, for two minutes in June 2008 and nine minutes in October that year, the draft says, citing a closed-door U.S. Air Force briefing.
The draft report doesn’t elaborate on the nature of the hackers’ interference with the satellites.

Chinese Military Writings

U.S. military and intelligence agencies use satellites to communicate, collect intelligence and conduct reconnaissance. The draft doesn’t accuse the Chinese government of conducting or sponsoring the four attacks. It says the breaches are consistent with Chinese military writings that advocate disabling an enemy’s space systems, and particularly “ground-based infrastructure, such as satellite control facilities.”
U.S. authorities for years have accused the Chinese government of orchestrating cyber attacks against adversaries and hacking into foreign computer networks to steal military and commercial secrets. Assigning definitive blame is difficult, the draft says, because the perpetrators obscure their involvement.
The commission’s 2009 report said that “individuals participating in ongoing penetrations of U.S. networks have Chinese language skills and have well established ties with the Chinese underground hacker community,” although it acknowledged that “these relationships do not prove any government affiliation.”

Chinese Denials

China “conducted and supported a range of malicious cyber activities” this year, the draft reports. It says that evidence emerging this year tied the Chinese military to a decade-old cyber attack on a U.S.-based website of the Falun Gong spiritual group.
Chinese officials long have denied any role in computer attacks.
The commission has “been collecting unproved stories to serve its purpose of vilifying China’s international image over the years,” said Wang Baodong, a spokesman for the Chinese Embassy in Washington, in a statement. China “never does anything that endangers other countries’ security interests.”
The Chinese government is working with other countries to clamp down on cyber crime, Wang said.
Defense Department reports of malicious cyber activity, including incidents in which the Chinese weren’t the main suspect, rose to a high of 71,661 in 2009 from 3,651 in 2001, according to the draft. This year, attacks are expected to reach 55,110, compared with 55,812 in 2010.

Relying on the Internet

In the October 2008 incident with the Terra AM-1, which is managed by the National Aeronautics and Space Administration, “the responsible party achieved all steps required to command the satellite,” although the hackers never exercised that control, according to the draft.
The U.S. discovered the 2007 cyber attack on the Landsat-7, which is jointly managed by NASA and the U.S. Geological Survey, only after tracking the 2008 breach.
The Landsat-7 and Terra AM-1 satellites utilize the commercially operated Svalbard Satellite Station in Spitsbergen, Norway, which “routinely relies on the Internet for data access and file transfers,” says the commission, quoting a NASA report.
The hackers may have used that Internet connection to get into the ground station’s information systems, according to the draft.
While the perpetrators of the satellite breaches aren’t known for sure, other evidence uncovered this year showed the Chinese government’s involvement in another cyber attack, according to the draft.

TV Report

A brief July segment on China Central Television 7, the government’s military and agricultural channel, indicated that China’s People’s Liberation Army engineered an attack on the Falun Gong website, the draft said.
The website, which was hosted on a University of Alabama at Birmingham computer network, was attacked in 2001 or earlier, the draft says.
The CCTV-7 segment said the People’s Liberation Army’s Electrical Engineering University wrote the software to carry out the attack against the Falun Gong website, according to the draft. The Falun Gong movement is banned by the Chinese government, which considers it a cult.
CCTV-7 initially posted the segment on its website but removed the footage after media from other countries began reporting the story, the congressional draft says.

Military Disruption

The Chinese military also has been focused on its U.S. counterpart, which it considers too reliant on computers. In a conflict, the Chinese would try to “compromise, disrupt, deny, degrade, deceive or destroy” U.S. space and computer systems, the draft says.
“This could critically disrupt the U.S. military’s ability to deploy and operate during a military contingency,” according to the draft.
Other cyber intrusions with possible Chinese involvement included the so-called Night Dragon attacks on energy and petrochemical companies and an effort to compromise the Gmail accounts of U.S. government officials, journalists and Chinese political activists, according to the draft.
Often the attacks are found to have come from Chinese Internet-protocol, or IP, addresses.
Businesses based in other countries and operating in China think that computer network intrusions are among the “most serious threats to their intellectual property,” the draft says.
The threat extends to companies not located in China. On March 22, U.S. Internet traffic was “improperly” redirected through a network controlled by Beijing-based China Telecom Corp. Ltd., the state-owned company that is the largest provider of broadband Internet connections in the country, the draft said.
In its draft of last year’s report, the commission highlighted China’s ability to direct Internet traffic and exploit “hijacked” data.