NASA's Apollo technology has changed history

Forty years after astronauts on NASA's Apollo 11 spacecraft first landed on the moon, many experts say the historic event altered the course of space exploration as well as humanity's view of its place in the universe.



The Apollo missions also had another major effect on the world -- rapidly accelerating the pace of technology development. The work of NASA engineers at the time caused a dramatic shift in electronics and computing systems, scientists say.

Without the research and development that went into those space missions, top companies like Intel Corp. may not have been founded, and the population likely wouldn't be spending a big chunk of work and free time using laptops and Blackberries to post information on Facebook or Twitter.

"During the mid- to late-1960s, when Apollo was being designed and built, there was significant advancement," said Scott Hubbard, who worked at NASA for 20 years before joining the faculty at Stanford University, where he is a professor in the aeronautics and astronautics department. "Power consumption. Mass. Volume. Data rate. All the things that were important to making space flight feasible led to major changes in technology. A little-told story is how much NASA, from the Cold War up through the late '80s or early '90s, affected technology."

It's fairly well-known that technology developed by NASA scientists routinely makes its way into products developed in the robotics, computer hardware and software, nanotechnology, aeronautics, transportation and health care industries. While the story that Tang, the bright orange powdered beverage, was developed for astronauts is just a myth, many other advancements -- think micro-electromechanical systems, supercomputers and microcomputers, software and microprocessors -- were also created using technology developed by NASA over the past half century.

Hubbard noted that overall, $7 or $8 in goods and services are still produced for every $1 that the government invests in NASA.


But the string of Apollo missions alone -- which ran from the ill-fated, never-flown Apollo 1 mission in 1967 to Apollo 17, the last to land men on the moon, in 1972 -- had a critical, and often overlooked, impact on technology at a key time in the computer industry.

Daniel Lockney, the editor of Spinoff, NASA's annual publication that reports on the use of the agency's technologies in the private sector, said the advancements during the Apollo missions were staggering.

"There were remarkable discoveries in civil, electrical, aeronautical and engineering science, as well as rocketry and the development of core technologies that really pushed technology into the industry it is today," he said. "It was perhaps one of the greatest engineering and scientific feats of all time. It was huge. The engineering required to leave Earth and move to another heavenly body required the development of new technologies that before hadn't even been thought of. It has yet to be rivaled."

Lockney cited several technologies that can be directly linked to engineering work done for the Apollo missions.

Software designed to manage a complex series of systems onboard the capsules is an ancestor of the software used today in retail credit card swipe devices, he said. Race car drivers and firefighters now wear liquid-cooled garments based on those created for Apollo astronauts to wear under their spacesuits. And the freeze-dried foods developed for Apollo astronauts to eat in space are used today in military field rations, known as MREs, and as part of survival gear.

And those technologies are just a drop in the bucket compared with the importance of the integrated circuit and the emergence of Silicon Valley, both of which were closely linked to the Apollo program.

That integrated circuit, the forerunner of the microchip, is essentially a miniaturized electronic circuit that did away with the manual assembly of separate transistors and capacitors. It revolutionized electronics, and integrated circuits are now used in nearly all electronic equipment.

While Robert Noyce -- co-founder of Fairchild Semiconductor and, later, Intel Corp. -- is credited with co-inventing the microchip, it was Jack Kilby of Texas Instruments who demonstrated the first working integrated circuit, built for the U.S. Department of Defense and NASA.

NASA, according to Lockney, set the parameters of what it needed out of the technology and then Kilby designed it. Kilby later won the Nobel Prize in Physics for creating the technology.

"The co-investment between defense and civilian space was very real and hugely important," said Hubbard.

"With Apollo, they needed to cut down on weight and power consumption. Mass into space equals money," he said. "It has been and continues to be about $10,000 a pound to get to lower Earth orbit. They certainly don't want computers that take up basketball courts. They want something very powerful and very light that doesn't take massive power. That was one of the driving requirements that led to the development of the integrated circuit, where you put all the components on a chip rather than having a board stuffed with individual transistors and other circuit components."
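Hubbard's mass-equals-money point can be put into rough numbers. A minimal sketch, using the ~$10,000-per-pound figure from the quote and the roughly 70 lb mass of the Apollo Guidance Computer (both figures are approximate):

```python
# Back-of-envelope launch-cost arithmetic: every pound of avionics
# mass carries a recurring cost to reach orbit.
COST_PER_LB = 10_000  # approximate $/lb to low Earth orbit, per Hubbard

def launch_cost(mass_lb: float) -> float:
    """Dollars to lift a payload of the given mass to low Earth orbit."""
    return mass_lb * COST_PER_LB

# The Apollo Guidance Computer weighed roughly 70 lb, so flying one cost
# on the order of $700,000 in launch mass alone -- a strong incentive
# to shrink separate components onto integrated circuits.
print(launch_cost(70))
```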

He added that the microchip took the high-tech industry to a place of mass production and economies of scale.

"There was a major shift in electronics and computing and at least half credit goes to Apollo," said Hubbard. "Without it, you wouldn't have a laptop. You'd still have things like the Univac."

Dubai - a city of hi-tech contrasts

The United Arab Emirates (UAE) is known for its opulent seven-star hotels, tall buildings and shopping centres symbolic of a society driven by aspiration.
As the country's centre of innovation, Dubai is leading the way in technological advances, including in motoring.



The city has a free-flow tolling system, so drivers are automatically charged at toll gates using Radio Frequency Identification (RFID) technology.
The RFID chip, linked to a prepaid toll account, is mounted on the windscreen, and the fee is deducted when a car passes under the sensor.
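The per-gantry deduction logic of such a free-flow system can be sketched roughly as follows; the fee, tag ID and account structure here are invented for illustration and are not the Dubai operator's actual system:

```python
# Illustrative sketch of free-flow toll deduction: the gantry reads the
# RFID tag ID, looks up the linked prepaid account, and deducts a fee.
TOLL_FEE = 4.0  # hypothetical fee per crossing

accounts = {"TAG-1234": 20.0}  # tag ID -> prepaid balance (illustrative)

def pass_toll_gate(tag_id: str) -> float:
    """Deduct the toll from the prepaid account linked to the RFID tag."""
    if accounts.get(tag_id, 0.0) < TOLL_FEE:
        raise ValueError("insufficient balance; top up required")
    accounts[tag_id] -= TOLL_FEE
    return accounts[tag_id]

print(pass_toll_gate("TAG-1234"))  # balance after one crossing: 16.0
```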

Visitors opting for public transport will find the modern, driverless Dubai Metro ahead of its time compared with systems in most other countries.
Passengers in Gold Class can surf the net onboard via wi-fi, while the train itself is connected to the internet using WiMAX technology.

Young innovation

The youth of Dubai has embraced consumer technology and attempts to innovate in this sector have started emerging.
One company has created a memory stick which, paired with a computer, can stream the PC's media files as if they were on the stick itself.
This means a movie can be played back wirelessly on a media device such as a PlayStation without the need to copy any files.

Ahmad Zahran, the founder of Infinitec, said Dubai has a long way to go before it becomes a fertile ground for technology entrepreneurs.
"Investors would still rather invest in real estate than they would in an IT company. The concept of R&D'ing something out of the Middle East is just not understood. The thinking is why not just buy it in from China and export it?" he said.


'Bling' handsets

International brands have adopted the city as the place to experiment with new designs aimed exclusively at the Arab market.


Young people are adopting mobile phones with modern designs
"Well, there are different segments obviously. There is a very predominant segment which would go for what you would call 'bling' design," said Hamad Malik, regional marketing director at LG. "People here like golden colour mobiles, pink colour mobiles do very well."
The Blackberry in particular has found a cultural niche in what still is a conservative society where pre-marital liaisons between boys and girls are strongly discouraged.

Younger generations of Emiratis are turning in droves to the mobile phone's free instant messaging service which leaves no trace of conversations on the handset, unlike text messages.
Distinctive features such as built-in solar charging, compasses pointing in the direction of Mecca and reminders of prayer times also give international phone brands leverage with an underserved local market.

Restricted voice

Despite enthusiasm in the smartphone market, the availability of broadband for domestic consumption is still rather limited, with only two providers to choose from.
It costs a home £45 per month for speeds of up to 1 Mbps, and some of the content on the net has also been restricted.

Dubai has also made it illegal to use voice over IP (VoIP) call services such as Skype, which could provide a cheaper alternative to costly international calls.
Raghu Venkatamaran, from telecoms provider Du, said the Skype business model would mean others would benefit at his company's expense.

"They are not investing in fibre, they are not investing in technology to carry calls," he said.
"They are not paying us a single penny for building our networks. We are a young operator and we spend a lot of money building up a nationwide telecom infrastructure".

Blacklisted sites

As a Muslim country, the UAE still has conservative values when compared to some other parts of the world.

The Dubai authorities have blocked some sites including social networks
The country's authorities have blocked access to some websites, including social networking destinations, but not online news.

However, an attempt to view photo-sharing site Flickr brings up a screen saying it is "content that is prohibited under the Internet Access Management Regulatory Policy" of the UAE.
Alexander McNabb, a Dubai-based tech blogger said: "Panoramio is unblocked and so are other photo sharing services like deviantART I believe because the technology is available to allow some selective blocking of what is pornographic full nudity content. But apparently that can't be done with Flickr".

The country's web filtering works on a blacklist of sites and individual pages with content considered inappropriate for the region.
Web filtering occasionally rejects criticism of Dubai's leadership - for instance if a blog contains a cartoon deemed insulting to the Sheikh.

Technology advances at Central Laser Facility

The Central Laser Facility in the UK houses five laser systems that are regularly accessed by scientists from academia and industry. With a total user community estimated at more than 600 scientists, the CLF provides more than 4400 individual user-access days per year, including 2700 for PhD students.



In the last five years, research conducted at the CLF has produced more than 500 peer-reviewed publications, often in extreme areas of science.

The CLF's five major laser facilities enable scientists to study computational plasma physics, experiment with micro-fabrication and laser microscopy, manipulate cells with laser tweezers, and perform other experiments.

Vulcan facility
The Vulcan facility, one of the most intense lasers in the world, has broken records on several occasions for producing the highest optical intensity ever on a target. It is based on versatile Nd:glass chirped-pulse-amplification technology delivering beams into two target areas.

In its petawatt mode, it generates a 1 PW (500 J / 500 fs) beam with a peak intensity of 10²¹ W cm⁻². Shots can be fired every 20 minutes. The temporal shape of the long-pulse beams can be programmed, and frequency-doubling crystals can be used to generate the second harmonic.
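The petawatt figure follows directly from the quoted pulse parameters, since peak power is pulse energy divided by pulse duration. A quick sketch (the focal-spot back-calculation is illustrative, not a CLF specification):

```python
# Peak power is pulse energy over pulse duration; intensity is power
# over focal-spot area (I = P / A). Pulse figures are the Vulcan
# petawatt numbers quoted above.
energy_j = 500.0        # pulse energy, 500 J
duration_s = 500e-15    # pulse duration, 500 fs

peak_power_w = energy_j / duration_s              # ~1e15 W, i.e. 1 PW

target_intensity = 1e21                           # W/cm^2, as quoted
focal_area_cm2 = peak_power_w / target_intensity  # ~1e-6 cm^2

# An area of ~1e-6 cm^2 corresponds to a focal spot roughly ten microns
# across, which is why such beams must be focused extremely tightly.
```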

The Vulcan facility is used in fusion energy, electron- and ion-acceleration research, laboratory astrophysics and plasma physics research.

 
Vulcan petawatt laser.
Gemini complements Vulcan
The Gemini facility is a high-power laser based on Ti:sapphire chirped-pulse-amplification technology, delivering a unique dual-beam combination of 2 × 0.5 PW (30 fs, 15 J) on target at a rate of one shot every 20 seconds. It can deliver peak intensities greater than 10²¹ W cm⁻² with 10⁻¹¹ amplified spontaneous emission (ASE) contrast.

The capabilities of the Gemini facility balance and complement those of Vulcan.

 
The Gemini facility.
Artemis beam lines
Artemis is a high-repetition-rate Ti:sapphire laser system providing a unique combination of ultrafast and XUV pulses for studies of ultrafast dynamics in atomic, molecular, and condensed-matter systems. Coherent XUV pulses are produced through high harmonic generation (HHG) and are synchronized to pump or probe pulses from the UV to the far infrared.

One XUV beam line provides widely tuneable femtosecond narrowband XUV pulses with near transform-limited energy resolution, enabling time- and angle-resolved photoemission measurements from condensed matter with XUV probe pulses. A second broadband XUV beamline is used for HHG spectroscopy, including measurements of attosecond dynamics in molecules.

 
The Artemis facility.
ULTRA used for dynamic processes
ULTRA is one of the world's most advanced systems for multi-dimensional time-resolved spectroscopy, offering a range of ultrafast vibrational techniques. It operates at 10 kHz with a multi-beam specification, synchronously and independently tuneable from the ultraviolet to the deep infrared, providing fs to ms time-resolved IR, ultrafast 2-dimensional infrared spectroscopy, ps to ms time-resolved 2D IR spectroscopy, and ps time-resolved resonance Raman spectroscopy.

ULTRA can operate on samples in the solution and solid phases and is used for the investigation of dynamic processes in areas ranging from fundamental chemistry to structural biology.

OCTOPUS imaging stations
OCTOPUS comprises seven advanced imaging stations driven from a central laser hub with more than 20 specialized laser systems. Any number of these can be directed simultaneously to any of the stations for maximum experimental flexibility.

Advanced image processing algorithms are used to image real systems in real environments in real time, often well below the diffraction limit. Applications are mainly in the biomedical sciences, covering areas such as research into in-cell protein structure, cell signaling networks in health and disease, virus infection, and plant biology.

The end stations include multiphoton confocal, single-photon confocal, single molecule, single-molecule confocal, and super-resolution (PALM and STORM) imaging.

CLF tweezers for healthcare research

The CLF has developed several highly specialized optical traps, or “tweezers,” that are used for holding and manipulating objects such as bacteria, protein crystals, cells, or micron-sized beads.

Their applications include measuring pico-Newton forces within cells either through direct capture or by using laser-controlled nanoprobes in environmental and physical science research.

A new technique for microcrystal manipulation with laser tweezers, developed by researchers from the Diamond Light Source and the CLF, was reported earlier this year; it may save precious research days and resources and lead to faster breakthroughs in healthcare.

The novel laser-tweezer process helps to streamline the selection and placement of microscopic protein crystals for crystallographic analysis.

Preparing the crystals can often take a long time, but the new technique allows researchers to select and place the protein crystals on customized sample holders for crystallographic analysis on one of the structural biology beamlines at Diamond Light Source, the UK's national synchrotron science facility.

Studying protein crystals through X-ray crystallography enables researchers to understand the structure and function of a molecule.

Acceleration and ignition experiments
The Vulcan facility has been extensively used to investigate and develop the science underpinning advanced ignition routes towards achieving fusion energy. The achievements include demonstration of the Fast Ignition concept by Ryosuke Kodama and collaborators in 2001.

This technique requires much lower implosion velocities than conventional central hot-spot inertial fusion schemes such as that used at the USA’s National Ignition Facility.

The CLF has also initiated and led the European High Power Laser Energy Research (HiPER) initiative to translate the single-shot proof of principle of inertial confinement fusion into a program of inertial fusion civil energy research, where the process can be repeated many times a second to extract desirable power output at ~1 GW level.

Several breakthroughs in laser-driven plasma-based electron accelerators have also come from the CLF, from the first evidence of wave breaking to the acceleration of mono-energetic electrons with energies above 1 GeV on Gemini. With the advent of 10 PW lasers, bright mono-energetic electron beams with energies above 10 GeV promise to become a reality.

Laser-driven proton and ion acceleration experiments are now capable of delivering >50 MeV per nucleon. Work on developing new acceleration mechanisms exhibiting improved beam properties and better energy scaling is currently underway. Potential future applications include proton therapy for cancer treatment.

Intense X-ray sources
The high fields associated with the focused Vulcan and Gemini laser beams have enabled the exploration and evaluation of soft and hard X-ray production physics as sources of secondary radiation. High-harmonic production has been studied on both Vulcan and Gemini using targets ranging from low density gases to solids. A UK team led by Queen’s University of Belfast extended these bright sources into the keV region for the first time using the Vulcan laser.

At the high laser intensities and relatively long pulse durations provided by the Vulcan PW laser facility, scientists reported in 2008 that up to 5% of the electron energy in the beam is converted directly into synchrotron X-rays, and that the peak brightness of the X-ray beam is 10¹⁷ photons/s/mm²/mrad²/0.1% bandwidth with a critical energy of 25 keV, starting to approach the output parameters of second-generation synchrotron sources.

Medical and chemical research
OCTOPUS confocal and super-resolution stations provide information on the interaction of drug and probe molecules with cells and tissues, with applications in medical therapies and diagnostics. Photodynamic therapy is investigated using both ULTRA (chemistry underlying the process) and OCTOPUS (behavior of therapeutic agents in cells and tissues).

OCTOPUS stations are also in use developing techniques to understand, from a systems biology viewpoint, the function of drugs at the molecular level in diseases like cancer, osteoporosis, and neurological disorders. Researchers are also using OCTOPUS to study the structural dynamics of protein folding in Alzheimer’s and motor-neuron disease and to understand how DNA is damaged and repaired.

Researchers at the CLF also invented, and continue to develop, Spatially Offset Raman Spectroscopy (SORS), which can probe deep within tissues for noninvasive cancer and bone disease diagnosis.

Companies spin out from CLF
CLF researchers, in collaboration with UK industry, government agencies, hospitals, and the defense industry, have played a key role in the formation of about 12 spin-out companies.

One company, Cobalt Light Systems, is developing unique products for analyzing chemical materials through translucent layers (bottles, bags, clothes, skin, etc.) using SORS.

 
Cobalt instruments use SORS technology.
Cobalt Light Systems has developed products and technologies for noninvasive screening of bottles for liquid explosives for aviation security, quality control of pharmaceutical products, and scanning of incoming raw materials noninvasively in pharmaceutical manufacturing.

The CLF’s most recent spinout, Scitech Precision, supplies micro-targets for high-power laser experiments developed originally at CLF’s target fabrication facilities.

IR research on chemical reactions and processes
The ULTRA infrared station is used to investigate chemical reactions in solution on the picosecond and femtosecond timescales. A success reported in 2011 was the elucidation of reactions of CN radicals; comparisons with the gas phase revealed subtle, yet important differences due to interactions with the solvent.

A novel technique for mapping energy transfers within molecules, giving an instantaneous snapshot of electron interactions and types of couplings, was developed on Artemis by Ian Mercer. The technique of angle-resolved coherent (ARC) wave-mixing uses broadband light from a hollow fiber source in a new twist on four-wave mixing.

ARC measurements of a photosynthetic protein from purple algae showed peaks due to energy transfer in and between different pigments in the protein. The work confirmed evidence of strong coherent coupling in the protein and revealed the time-ordering of transition energies.

OPCPA developed in UK
The CLF pioneered the technique of Optical Parametric Chirped Pulse Amplification (OPCPA) for the generation of extreme peak powers and intensities in excess of 1 PW. The concept gets around the issue of limited temporal compressibility of typically spectrally narrow high-power laser beams.

This opens a new route to very high peak-power laser systems (>100 PW). The concept was quickly adopted as the pre-amplifier of choice for high-energy petawatt laser systems around the world.

Aerial photo of the Central Laser Facility.
At the CLF, OPCPA is used as the front end for the Vulcan Petawatt beam line and forms the basis of a project to upgrade the Vulcan High Power Laser facility to ~10-20 PW peak power, providing unique access to focused intensities of >10²³ W cm⁻².

When completed, this will be the most powerful and most intense laser in the world. It will be capable of investigating new areas of extreme physics, pushing high-energy-density plasma physics into completely new and uncharted territory where the effects of QED in a plasma environment start to play a significant, if not dominant, role.

Development of DiPOLE
Development of the next generation of efficient laser sources is key to realizing many of the conceptual applications developed on high-power lasers. Due to the inherent inefficiencies of existing flashlamp-pumped laser technology, high-energy laser systems (>10 J) are broadly limited to repetition periods of tens of seconds.

In contrast, laser diodes offer high electrical efficiency, a low divergence output giving efficient pump radiation coupling, and an output spectrum which can be tuned to the absorption region of the laser medium. The resulting overall efficiency of such a source is typically greater than 10%, compared to 0.1% for traditional flashlamp-driven systems.

This opens up new opportunities combining high peak power with high average power. In this area, the CLF is developing a system, the Diode Pumped Laser for Optical Experiments (DiPOLE), with an architecture inherently scalable in single beams to energies beyond 1 kJ at repetition rates above 10 Hz. Such technology promises to open up a host of new applications, such as security screening of containers, healthcare imaging, and oncology.
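The efficiency gap translates directly into the wall-plug power a high-average-power system must draw. A rough sketch using the ~10% diode and ~0.1% flashlamp efficiencies quoted above (illustrative arithmetic only, not DiPOLE design figures):

```python
# Wall-plug power needed to sustain a given average optical output,
# comparing diode pumping (~10% efficient) with flashlamps (~0.1%).
pulse_energy_j = 10.0   # per-pulse energy of a high-energy system
rep_rate_hz = 10.0      # target repetition rate

avg_optical_w = pulse_energy_j * rep_rate_hz   # 100 W average output

diode_wall_plug_w = avg_optical_w / 0.10       # ~1 kW drawn from the wall
flashlamp_wall_plug_w = avg_optical_w / 0.001  # ~100 kW drawn from the wall

# The hundredfold difference is why diode pumping, not flashlamps, makes
# combined high-peak-power, high-average-power operation practical.
```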

How many pages are on the internet?

(CNN) -- It seems like an answerable question, right?
But no one really knows how many websites or individual Web pages make up this seemingly infinite digital universe that is the internet.



Kevin Kelly, a founder of Wired magazine, has written that there are at least a trillion Web pages in existence, which means the internet's collective brain has more neurons than the actual gray matter stuffed between our ears.

"The Web holds about a trillion pages. The human brain holds about 100 billion neurons," Kelly writes in his 2010 book "What Technology Wants."
"Each biological neuron sprouts synaptic links to thousands of other neurons, while each Web page on average links to 60 other pages. That adds up to a trillion 'synapses' between the static pages on the Web. The human brain has about 100 times that number of links -- but brains are not doubling in size every few years. The global machine is."

Wild, huh?
Well, at long last, an answer may be coming.
A group called the World Wide Web Foundation -- appropriately founded by Tim Berners-Lee, the inventor of the World Wide Web -- is on a quest to figure out, with some degree of certainty, how big the internet really is.

With a $1 million grant from Google, the foundation plans to release the results of its online forensic search, called the World Wide Web Index, early next year, the foundation's CEO, Steve Bratt, said in a recent interview.

Here's how the foundation described the project in an e-mail to CNN:
"The Web Index will be the world's first multi-dimensional measure of the Web and its impact on people and nations. It will cover a large number of developed and developing countries, allowing for comparisons of trends over time and benchmarking performance across countries."
Bratt stressed that it won't answer every question people have about the internet, but he hopes the index, which will be presented as a series of annual reports, will go a long way toward filling in some of the gaps.

"We want to be really careful about what will happen (as a result of the Web Index) because we just don't know," he said. "But this will be probably the best opportunity to quantify" the Web.
So, what kind of tools does one use to try to measure the internet? Certainly not yardsticks and rulers, right?

Bratt said the Web Foundation will conduct surveys of internet users, interview relevant people and try to gather data from internet service providers, national governments and search engines such as Google to come up with its findings.

In addition to looking at how big the Web is, the group wants to use data to tease out the role social media sites had in sparking revolution in the Middle East this year. And it wants to find out what kinds of websites people all over the world are looking at; what websites exist; and how internet trends differ from country to country and region to region.

The International Telecommunication Union digs into some similar questions, publishing reports on the number of internet users in various countries and how fast connections are around the world (South Korea is by far the fastest, in case you were wondering; the United States is super-slow in comparison).

Bratt said the Web Foundation's work will supplement, not replace, what the ITU does.
The foundation is starting work on the Web Index soon and is still seeking funding for the project, he said. The first of five annual reports will be available early next year, the group says.

Google Traffic Dominates the Internet

Like a giant gravity-bending star, Google has grown so massive it is starting to have a measurable effect on Internet traffic flows, an analysis of the company's activities has found.
The blog analysis by Arbor Networks' Craig Labovitz follows on from his company's Atlas Observatory Report of last October, which offered a fascinating insight into how the Internet is being moulded by a small and shrinking number of super-carriers, with Google at their head.



Arbor has now provided more detail on the astounding explosion of Google's Internet presence, which as of last summer it estimates as accounting for peak rates of 10 percent of all inter-domain Internet traffic it sees travelling through its servers.

Between June 2007 and a year later, the average traffic percentage grew from around 1 percent to around 2.5 percent; by last summer the percentage was a minimum of 5 percent and growing.

The main reason was Google's acquisition of YouTube in 2006, which generates huge volumes of video traffic -- the application that, almost on its own, is driving capacity growth at the peer network level.

As significant as their sheer number is how all these Google-related packets move across the Internet. In mid-2007, Google used third-party "transit" (i.e. other networks) for a large percentage of its Internet traffic. By this February, Arbor reckons that over 60 percent of Google's traffic was being channelled through direct interconnects that link its massive data centres to one another.

To put this in less technical terms, Google and the customers using its services are not so much using the Internet as Google's own private corner of it, a peered network within a wider Internetwork.

Arbor's Labovitz reminds us that Google has apparently spent the last year installing Google Global Cache Servers (GGCs) in as many as half of all third-party consumer networks in the US and Europe, which extends the edge of its network into even more data centres.

"Unlike most global carriers, Google's backbone does not deliver traffic on behalf of millions of subscribers nor thousands of regional networks and large enterprises. Google's infrastructure supports, well, only Google," comments Labovitz.

Famous for its search, e-mail and YouTube video sharing, Google has quietly and relentlessly turned itself into the first super-carrier of the Internet era.

Fast Internet access becomes a legal right in Finland

(CNN) -- Finland has become the first country in the world to declare broadband Internet access a legal right.



The move by Finland is aimed at bringing Web access to rural areas, where access has been limited.

Starting in July, telecommunication companies in the northern European nation will be required to provide all 5.2 million citizens with an Internet connection that runs at speeds of at least 1 megabit per second.

The one-megabit mandate, however, is simply an intermediary step, said Laura Vilkkonen, the legislative counselor for the Ministry of Transport and Communications.

The country is aiming for speeds 100 times faster -- 100 megabits per second -- for all by 2015.

"We think it's something you cannot live without in modern society. Like banking services or water or electricity, you need Internet connection," Vilkkonen said.

Finland is one of the most wired countries in the world; about 95 percent of the population has some sort of Internet access, she said. But the law is designed to bring the Web to rural areas, where geographic challenges have limited access until now.

"Universal service is every citizen's subjective right," Vilkkonen said.

Should fast Internet access be everyone's legal right?

It is a view shared by the United Nations, which is making a big push to deem Internet access a human right.

In June, France's highest court declared such access a human right. But Finland goes a step further by legally mandating speed.

On the other hand, the United States is the only industrialized nation without a national policy to promote high-speed broadband, according to a study released in August by the Communications Workers of America, the country's largest media union.

Forty-six percent of rural households do not subscribe to broadband, and usage varies based on income, the study found.

In February, the U.S. Federal Communications Commission is expected to submit a national plan to Congress. The FCC says that expanding service will require subsidies and investment of as much as $350 billion -- much higher than the $7.2 billion President Barack Obama's economic stimulus package has set aside for the task.

3G network brings internet to Mount Everest climbers

Mount Everest climbers can now surf the internet and make video calls through a 3G network, Nepalese telecoms firm Ncell says.




The company has installed eight 3G base stations along the route to Everest base camp.
The wireless network could help thousands of tourists who visit Mount Everest every year, Ncell claims.

Climbers and trekkers in the Everest region have so far relied on satellite phones and a voice-only mobile network.
Ncell, which is owned by the Swedish company TeliaSonera, says its highest 3G base station is near Everest base camp at 5,200 metres (17,000 ft).
Coverage would reach the summit of the world's highest mountain, company head Pasi Koistinen said, though he added that this had not yet been tested.
The 3G network will help climbers and trekkers stay in touch with their families and trip organisers, Mr Koistinen said.

It will also enable them to receive weather reports and safety information while they are climbing.
Around 3,000 people have climbed to the Everest summit since Edmund Hillary first conquered the peak in 1953 and used runners to carry messages from his expedition to the nearest telegraph office.
Less than a third of Nepal's population has access to telecommunication services.
TeliaSonera announced that it would invest more than $100m (£63m) in the next year to increase mobile coverage in the country.

Bill Gates to Mark Zuckerberg: Internet Will Not Save the World

Mark Zuckerberg's claim that getting the world online will be "one of the most important things we do in our lifetime" is a joke, according to Microsoft co-founder and chairman Bill Gates.




The Microsoft co-founder said he "certainly loves the IT thing," but has reservations about Silicon Valley's desire to help the world by bringing the internet to some of its poorest countries. "When we want to improve lives, you've got to deal with more basic things like child survival, child nutrition," the 58-year-old explained in an interview with the Financial Times.

In August, Zuckerberg, along with Samsung, Nokia, Ericsson, Qualcomm and other technology companies announced Internet.org and its goal of bringing the internet to the five billion people not yet online. The Facebook founder described this aim as "one of the most important things we will do in our lifetime."

But Gates disagrees. When asked if giving the whole world an internet connection is more important than his foundation's goal of curing malaria, Gates said: "As a priority? It's a joke."

The world's second-richest man and Harvard dropout continued, sarcastically: "Take this malaria vaccine, [this] weird thing that I'm thinking of. Hmm, which is more important, connectivity or malaria vaccine? If you think connectivity is the key thing, that's great. I don't."

Referring to technology more widely, Gates adds: "The world is not flat and PCs are not, in the hierarchy of human needs, in the first five rungs."

Richard Waters of the FT states that, after the interview, Gates' "minders" called to try and persuade the reporter to omit the technology icon's comments on Zuckerberg. "As a senior statesman of the tech and philanthropic worlds, it doesn't help these days to pick fights," Waters observes.

Eradicating polio

Run by Gates and his wife, the Bill & Melinda Gates Foundation made polio eradication a priority in 2008, after seeing that global efforts to wipe out the disease had gone off course: a decade of progress had failed to build the global coverage and momentum needed to push the virus to extinction.

Gates says the organisations involved in these efforts had "sort of naively assumed it was on track, but it wasn't. The idea that business as usual was going to get us there - it has to be broken out of that [way of thinking], because it wasn't going to succeed.

"It probably would have been better to just give up than do business as usual. But that would have been horrific."

Now that his foundation has stepped in to help beat polio, Gates says partly suppressing the disease is no good; only complete eradication, as has been accomplished in India, will be enough.

"Eradications are special. Zero is a magic number. You either do what it takes to get to zero and you're glad you did it; or you get close, give up and it goes back to where it was before, in which case you wasted all that credibility, activity, money that could have been applied to other things."

From Microsoft to Polio

Though he stepped down as CEO of Microsoft in 2000, Gates retains the position of chairman to this day and will play a key role in securing a replacement for current chief executive Steve Ballmer. Gates dedicates one day a week to the company, holds regular meetings with some of its product groups, and expects to spend time working with Ballmer's replacement.

In switching from world-beating computer giant to disease-beating philanthropist, Gates has had to change his approach to business and management. "I was kind of a hyper-intensive person in my twenties and very impatient. I don't think I've given up either of [those] things entirely. Hopefully it's more measured, in a way."

Explaining how he has changed, Gates recalls a meeting when his foundation's work with polio was suffering. Gates told his workers: "'Hey, this is not good thinking, this is not good, this is not going to get us there'."

Once he had calmed down, Gates spoke to his wife Melinda: "I said...was I too tough on that, who should I send mail to, was that motivational, de-motivational? It's all a matter of degree."

World's first 'tax' on Microsoft's Internet Explorer 7

The Australian online retailer Kogan.com has introduced the world's first "tax" on Microsoft's Internet Explorer 7 (IE7) browser.

Customers who use IE7 will have to pay an extra surcharge on online purchases made through the firm's site.

Chief executive Ruslan Kogan told the BBC he wanted to recoup the time and costs involved in "rendering the website into an antique browser".
The charge is set at 6.8% - 0.1% for every month since IE7's launch - and will rise by a further 0.1% each month.
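The pricing rule is a simple linear formula, sketched below in Python. The purchase date is an arbitrary example chosen for illustration; IE7's launch date of October 18, 2006 is the only fixed input.

```python
from datetime import date

IE7_LAUNCH = date(2006, 10, 18)  # Internet Explorer 7's release date

def ie7_surcharge(purchase_date, price):
    """Kogan's stated rule: 0.1% for every month since IE7 launched."""
    months = ((purchase_date.year - IE7_LAUNCH.year) * 12
              + (purchase_date.month - IE7_LAUNCH.month))
    rate = months * 0.001  # 0.1% per month
    return rate, price * rate

# 68 months after launch, the surcharge matches the article's 6.8%.
rate, extra = ie7_surcharge(date(2012, 6, 14), 100.00)
print(f"{rate:.1%} surcharge -> ${extra:.2f} on a $100 order")
```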

Too much effort

According to Mr Kogan the idea was born when the company started working on a site relaunch.
Mr Kogan said that even though only 3% of his customers used the old version of the browser, his IT team had become preoccupied with making adaptations so that pages displayed properly on IE7.
"I was constantly on the line to my web team. The amount of work and effort involved in making our website look normal on IE7 equalled the combined time of designing for Chrome, Safari and Firefox."

Mr Kogan said it was unlikely that anyone would actually pay the charges. His goal is to encourage users to download a more up-to-date version of Internet Explorer or a different browser.
Mr Kogan told the BBC his customers were very happy and he had received a lot of praise for his efforts.

"Love your IE7 tax. I hope it becomes effective" was one of the messages posted to Kogan on Twitter.
IE7 was launched in 2006, but since then Microsoft has released two major updates to the software.
The launch of Internet Explorer 10 is due in the autumn.

Google Fiber: Pros and Cons

Everyone, and I mean absolutely everyone, wants Google Fiber. And who wouldn’t?


Its service, with 1000 Mbps download and upload speeds, is 100 times faster than the Internet connection that most people have today. That means no more buffering videos, cloud gaming that doesn’t slow down the entire house, and the genesis of HD videoconferencing for the average Joe.

And it’s cheap. The lucky citizens of Kansas City—the first U.S. city to get it from Google—will only pay $120 or $70 for their gig of Internet access. And for those Midwesterners who are happy with a regular old 5Mbps connection, they can get it for free for seven years after paying a $300 construction fee.

[See more: Google's Lightning-fast Fiber Network Now Live in Kansas City]

Google Fiber has so much beauty, people are literally begging Google to bring it to their cities. Yet fiber doesn’t come without caveats. Here’s a look at some pros and cons.

Pro: ISPs Will Have to React
Competition is good for consumers and it means the cable companies are going to have to switch things up, or die.

For $100 a month, Comcast's premium Internet connection offers downloads of up to 50 Mbps and uploads of up to 10 Mbps, and that's with no TV whatsoever. ISPs are going to have to get faster and cheaper, which is good news for consumers.

Pro: No Bandwidth Caps
To its credit, Comcast, which is the leading ISP in the U.S., doesn’t count streaming from its own streaming video service against caps. But check this: Google doesn’t have any caps at all and even includes Netflix in its service when it could have given preferential treatment to its own YouTube and Fiber TV products.

This is huge, especially for people who live in places where cable hasn’t yet arrived (still true in many rural areas) and can only get high speed Internet access through a cellular carrier. In such cases, most of these folks can forget about streaming anything, unless the exorbitant cost of blasting through data caps doesn’t bother them.

Pro: A Super Cool Remote and Option for a Chromebook
The Google Nexus 7 tablet is so hot you can't even find one in a store right now—at least not the 16GB version. But not only is Kansas City the envy of every other metropolis in the U.S., people there who ante up for Google Fiber with TV get a free Nexus 7 that they can use as a remote, or for consuming media all on its own. While there's no word on whether the free tablet is the 8GB model or the 16GB version, does it really matter?


The Google Fiber site also shows the Chromebook as an option for $299. Even though some people have criticized the web-centric laptop, Google recently announced some important improvements to the device. And let’s be honest—you’re not going to find a laptop much cheaper.

Pro: Storage Galore
The DVR box that comes with Google Fiber TV lets users record up to eight shows at one time. Its 2TB hard drive also means you can store as much as 500 hours of HD video. That’s a lot of episodes of “30 Rock” you can have at your beck and call.

And people who opt for the Internet-only version get 1TB of cloud storage on Google Drive. That means a user’s every digital asset can be accessible from any Internet-connected device or computer whenever he or she needs it.

Con: No ESPN
Sports fanatics won't like this. A lot of people can probably live without some of the channels not currently included in Google Fiber TV, such as Disney, AMC, TBS, TNT, and HBO; but the exclusion of ESPN might be a deal killer for some.


Apparently the sports network is too rich for Google's taste. "ESPN charges cable providers around five times more than the average network," reports Time.

The good news is Google appears to be negotiating with some networks so as to get a more complete offering. By the time Fiber goes live, maybe some of these big guns will be on board.

Con: Don’t Expect it Anytime Soon
Speaking of the rollout of Fiber in Kansas City, it’s not going to happen overnight. After September 9, Google will announce a calendar that shows which neighborhoods will get it when.

The neighborhoods toward the top—based upon which ones have demonstrated their longing for Fiber the most by preregistering the most homes—likely won’t be flying around on the Internet like lightning until mid-2013. As for the rest of the world . . . well, there’s no telling.

Con: Now Google Will Really Know Everything
Google already knows a lot about its users—where we go online, what kinds of things we buy, where we take our Android phones. Some people don’t have a problem with this, and—like Google—feel it helps the search and advertising giant better serve them.

But Google’s track record on privacy is far from spotless. Some people aren’t going to be comfortable with Google also having their TV and movie consumption data, on top of everything else. If that concerns you, check out the Google Fiber Privacy Notice.

Con: For Now, A Gigabit Connection Can be Underwhelming
Fiber users with less-than-stellar equipment, or those trying to communicate with others on the Internet who aren't blessed with a great connection, aren't going to be hugely impressed with what they'll get. If you've ever tried to Skype with someone in either of those scenarios, you know what I mean.


As GigaOm points out, “the Internet is reciprocal so it’s no fun if you have the speeds to send a holographic image of yourself but no one on the other end can receive it.” In fact, “underwhelming” describes the experience in Chattanooga, Tennessee, where for the past two years the public utility has offered customers a gigabit fiber-to-the-home connection for about $300 a month.

Still, considering the alternatives—comparably slow and expensive offerings from cable companies—Kansas City getting Google Fiber is akin to the city winning a digital lottery.

The World Wants to Break Up with America's Internet

Thanks to this year's NSA revelations, the world wants to break up with the United States’ internet. The only problem? It's not sure how to.

Last week, global Internet organizations met in Uruguay—which happens to be leading South America in Internet penetration—and pledged to untangle themselves from North American influence. While they didn't come right out and explicitly name Snowden and his leaks about the NSA's surveillance program, or even the United States, their released statement "expressed strong concern over the undermining of the trust and confidence of Internet users globally due to recent revelations of pervasive monitoring and surveillance."

Naturally, that's referring to Snowden's leaks about NSA activities. Now, privacy concerns are already leading to a shift in the way the internet operates. The US has traditionally been the world's leader in internet infrastructure, partly because it's been the center of innovation, and partly because, as TechCrunch wrote, our free speech laws "are perhaps the most ironclad of any."

It's become apparent that despite our free speech protections, the US internet is both heavily controlled—see the ongoing copyright battles—and surveilled. Domestically, those revelations are already having an impact. At least three secure email services have already shut down, and the eventual blowback to US tech companies in terms of lost business has been pegged at tens of billions of dollars.

Now the chilling effects of the NSA's broad spying activities are spreading internationally. Directors of all major Internet organizations were present at the Uruguay meeting, including the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Engineering Task Force, the Internet Architecture Board, the World Wide Web Consortium (W3C), and the Internet Society. Another portion of their released statement asked to accelerate “the globalization of ICANN and IANA functions, towards an environment in which all stakeholders, including all governments, participate on an equal footing.”

Note that "equal footing" doesn't exactly mean the US is getting cut out of the web, which would be pretty much impossible. ICANN and IANA are both US-based organizations that were originally created to do Internet-related tasks for the government, like create and distribute IP addresses. The United States created the Internet, more or less owns it, and has expressed no interest in giving up that control. The only way to break up with the American Internet right now is to completely disconnect from it—and no one wants to do that.

But according to Internet Governance, a day after the meeting, ICANN president and CEO Fadi Chehadé met with Brazilian President Dilma Rousseff, asking her to "elevate her leadership to a new level, to ensure that we can all get together around a new model of governance in which all are equal." Keep in mind that Chehadé was appointed by the US government, which has remained silent on this matter.

It is no coincidence that Brazil will host the next Internet Governance Summit; President Rousseff has been very vocal in criticizing the US's surveillance program and has expressed interest in building Brazil's own Internet.

Can it be done, though? Building an internet without the US, meaning new data centers and undersea cables, would take a massive amount of time and resources, and as Motherboard's Meghan Neal pointed out, it would also require users to stop using sites associated with Google (including YouTube), Apple and Facebook. That isn't to say no country is trying: Germany wants to keep all Internet traffic on local servers so it can't be spied on by the NSA, but that's not really a "German internet" free from the US.

Without a feasible alternative on the horizon the released statement by the Internet's core institutions is just a statement of intent, a protest song and dance indicative of a messy break-up that could last for years and years.

Canada Falls On Internet Speed Rankings

Canada has some of the slowest internet speeds in the developed world.

According to data from broadband research company Ookla, Canadians on average had the 38th fastest internet speeds in the world. Among developed countries, only a handful had slower internet speeds than Canada, among them Australia, New Zealand and Italy.

That represents a fall of five spots since this spring, when Canada ranked 33rd in Ookla’s survey.

Ookla crowdsources data from SpeedTest.net, a site that allows web users to test the speed of their connections. The research firm used millions of these tests to compile its data on internet speeds around the world. The numbers represent an aggregate of actual internet speeds, not speeds as advertised by providers.

The average internet speed in Canada was 18.94 Mbps over the past six months, up from 16.6 Mbps this spring. But despite the apparent improvement, Canada slipped on the rankings as other countries saw larger increases in internet speed.

Canadians' average speed is less than a third of world-leading Hong Kong's, at 70.91 Mbps.

How a country ranks on the listings depends not only on the quality of the internet infrastructure, but also on affordability: If internet services are too expensive, consumers will opt for lower-speed services, and average speeds will be lower.

Canada ranks 30th on Ookla’s list of lowest internet costs. At $3.61 per Mbps on average, Canada is slightly cheaper than the U.S. ($3.82) and slightly more expensive than Sweden ($3.55).

The CRTC’s latest survey of broadband services finds broadband is now available to 99 per cent of Canadian households, including 95 per cent of households in rural areas.

The availability of ultra-high-speed internet (100 Mbps+) has increased considerably, with about a third of Canadian households having the option to order the service. That’s up from only around 10 per cent in 2009.

The number of households with broadband grew to 13.5 million in 2012, up from 13 million in 2010, the CRTC reported.


Mobile Internet to Dominate Within 5 Years -- Study

The mobile Internet is growing faster than its desktop counterpart ever did, and more users may go online via mobile devices than desktop PCs within five years, according to a new study by investment firm Morgan Stanley.

The intriguing prediction is one of many in the firm's massive "The Mobile Internet Report," a 424-page epic that someone, somewhere is bound to read in its entirety. For the rest of us, the executive summary will do just fine. If you're interested in perusing the full report, you'll find it here.
The report states we're "now in the early innings" of mobile Internet development, which is growing faster than previous tech cycles, including the evolution of the desktop PC. Given the rapid adoption of smartphones, including (obviously) the Apple iPhone and a growing number of devices using Google's Android mobile operating system, Morgan Stanley's conclusions shouldn't surprise anyone.

The study also points out that mobile Net growth is a global phenomenon, not one confined to the developed world, which was typically the case with prior tech trends. But despite the worldwide focus, U.S. companies including Apple, Google, and Amazon are taking a leadership role. Furthermore, "a host of relatively young, but seasoned world-class technology veterans," including Apple CEO Steve Jobs and Facebook's Mark Zuckerberg, are leading the mobile push, the report states.

Five key tech trends are converging to spur mobile Net growth, including 3G (and soon 4G) broadband, the popularity of social networking, online video, VOIP services such as Skype and Vonage, and "awesome mobile devices" that do tasks that until recently were the sole domain of your desktop or laptop PC.

The short term looks especially bright for Apple, but challenges await.

The "mobile ecosystem" of the iPhone, iPod touch, iTunes, and various accessories and services will continue to bloom over the next two years. After that, however, Google Android, competition from emerging markets, and wireless carrier limitations may pose a threat to Apple's market share, the report predicts.

There's little doubt the mobile Internet will dominate in the coming years--just look how far mobile handsets have come since the debut of the iPhone in 2007. Toss in a growing selection of rapidly improving smartphones, a new breed of wireless-ready tablet devices, e-readers like the Amazon Kindle, and faster 4G networks, and it's easy to see that mobile is the future of the Net.

In New Form of Censorship, Iran Moves to Disconnect Its Internet From World

Iran is taking steps toward an aggressive new form of censorship: a so-called national Internet that could, in effect, disconnect Iranian cyberspace from the rest of the world.

The leadership in Iran sees the project as a way to end the fight for control of the Internet, according to observers of Iranian policy inside and outside the country. Iran, already among the most sophisticated nations in online censoring, also promotes its national Internet as a cost-saving measure for consumers and as a way to uphold Islamic moral codes.

In February, as pro-democracy protests spread rapidly across the Middle East and North Africa, Reza Bagheri Asl, director of the telecommunication ministry's research institute, told an Iranian news agency that soon 60% of the nation's homes and businesses would be on the new, internal network. Within two years it would extend to the entire country, he said.

The unusual initiative appears part of a broader effort to confront what the regime now considers a major threat: an online invasion of Western ideas, culture and influence, primarily originating from the U.S. In recent speeches, Iran's Supreme Leader Ayatollah Ali Khamenei and other top officials have called this emerging conflict the "soft war."

On Friday, new reports emerged in the local press that Iran also intends to roll out its own computer operating system in coming months to replace Microsoft Corp.'s Windows. The development, which couldn't be independently confirmed, was attributed to Reza Taghipour, Iran's communication minister.

Iran's national Internet will be "a genuinely halal network, aimed at Muslims on an ethical and moral level," Ali Aghamohammadi, Iran's head of economic affairs, said recently according to a state-run news service. Halal means compliant with Islamic law.

Mr. Aghamohammadi said the new network would at first operate in parallel to the normal Internet—banks, government ministries and large companies would continue to have access to the regular Internet. Eventually, he said, the national network could replace the global Internet in Iran, as well as in other Muslim countries.

The Internet now has 340 trillion trillion trillion addresses

NEW YORK (CNNMoney) -- One of the crucial mechanisms powering the Internet got a giant, years-in-the-making overhaul on Wednesday.

When we say "giant," we're not kidding. Silly-sounding huge number alert: The Internet's address book grew from "just" 4.3 billion unique addresses to 340 undecillion (that's 340 trillion trillion trillion). That's a growth factor of 79 octillion (billion billion billion).
If it all goes right, you won't notice a thing. And that's the point.

The Internet is running out of addresses, and if nothing were done, you certainly would notice. New devices simply wouldn't be able to connect.

To prevent that from happening, the Internet Society, a global standards-setting organization with headquarters in Geneva, Switzerland; and Reston, Va., has been working for years to launch a new Internet Protocol (IP) standard called IPv6.

IP is a global communications standard used for linking connected devices together. Every networked device -- your PC, smartphone, laptop, tablet and other gizmos -- needs a unique IP address.
With IPv6, there are now enough IP combinations for everyone in the world to have a billion billion IP addresses for every second of their life.

That sounds unimaginably vast, but it's necessary, because the number of connected devices is exploding. By 2016, Cisco (CSCO, Fortune 500) predicts there will be three networked devices per person on earth. We're not just talking about your smartphone and tablet; your washing machine, wristwatch and car will be connected too. Each of those connected things needs an IP address.
Then there's all the items that won't necessarily connect to the Internet themselves, but will be communicating with other wired gadgets. Developers are putting chips into eyeglasses, clothes and pill bottles. Each one of those items needs an IP address as well.

The current IP standard, IPv4, was structured like this: xxx.xxx.xxx.xxx, with each "xxx" able to go from 0 to 255. IPv6 expands that so each "x" can be a 0 through 9 or "a" through "f," and it's structured like this: xxxx:xxxx:xxxx:xxxx:xxxx:xxxx:xxxx:xxxx. (Yes, there was an IPv5, but it was a streaming multimedia standard developed in the late 1970s that never really caught on).
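The address formats map directly onto the address-space sizes, so the article's big numbers can be checked with a few lines of Python using the standard library's ipaddress module:

```python
import ipaddress

# IPv4: four decimal fields, each 0-255 -> 2**32 addresses.
ipv4_total = 2 ** 32
# IPv6: eight groups of four hex digits (16 bits each) -> 2**128 addresses.
ipv6_total = 2 ** 128

print(ipv4_total)                 # 4294967296 ("just" 4.3 billion)
print(ipv6_total)                 # ~3.4e38, i.e. 340 undecillion
print(ipv6_total // ipv4_total)   # growth factor: 2**96, about 79 octillion

# The module parses both formats; note IPv6's compressed notation.
v4 = ipaddress.ip_address("192.168.0.1")
v6 = ipaddress.ip_address("2001:0db8:0000:0000:0000:0000:0000:0001")
print(v6)  # prints the compressed form: 2001:db8::1
```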
The changeover is akin to when the U.S. telephone system handled soaring growth by increasing the digits in each telephone number -- except for one crucial difference. While the entire telephone system was upgraded in the 1990s, the Internet will be upgraded gradually.
IPv4 will continue to exist alongside IPv6 for quite some time, just as digital and analog TV were broadcast side-by-side for years.

Though most of the major Internet players will be IPv6 compliant going forward, many routers, devices and operating systems won't be. For instance, Microsoft (MSFT, Fortune 500) Windows XP, the world's most-used PC operating system, is not IPv6-compliant.

Just 1% of end users are expected to reach websites using the IPv6 standard for now. The Internet Society expects that share to grow gradually as users update their software and hardware.
Most of the major websites and networks are already participating. More than 2,000 websites, including Google (GOOG, Fortune 500), Facebook (FB), Bing, Yahoo (YHOO, Fortune 500), AOL (AOL) and Netflix (NFLX), as well as a number of network operators such as AT&T (T, Fortune 500), Verizon (VZ, Fortune 500), Comcast (CMCSA) and Time Warner Cable (TWC, Fortune 500), have begun enabling IPv6.

But they'll all need to continue to support IPv4 until the entire world upgrades. That will take years.
There have been some grumblings about cyberattackers getting ready to pounce on Wednesday, taking advantage of potential holes in a new technology. But a year ago, on June 8, 2011, all those participating networks and sites turned on IPv6 for a day-long test run without a hitch.
They reverted to IPv4 the next day. This time, the change is permanent. It'll be a slow transition, but it's a crucial one that will support the Internet's current rate of expansion far into the future.

Fake internet cafes and keyloggers: British intelligence reportedly spied on major world leaders during 2009 G20 summit

Using tactics that included luring diplomats into fake internet cafes, The Guardian reports that British intelligence spied on major world leaders during the 2009 G20 summit in London. The revelation is based on documents provided by whistleblower Edward Snowden, who fueled earlier leaks about the Government Communications Headquarters (GCHQ) and the US National Security Agency (NSA) through The Guardian and The Washington Post.

The G20 summit in London included President Obama as well as 20 other heads of state and governing bodies. During the summit, the GCHQ reportedly monitored the foreign politicians' computers and phone calls, and had direct permission to do so from high-level officials in then-PM Gordon Brown's administration.

The intent of the alleged spying was to gain an edge in negotiations against other countries, including Turkey and South Africa, according to The Guardian. The GCHQ managed to tap into phones and computers by establishing internet cafes with built-in key logging and email intercepting software, as well as by hacking delegates' BlackBerrys to monitor messages and phone calls. The British intelligence agency was apparently able to read the attendees' emails even before the attendees themselves accessed them.

The key logging reportedly may also have provided the GCHQ with online login details — such as usernames and passwords — that were used by foreign leaders. The NSA, which shares information with the GCHQ, was allegedly gathering information during the summit as well, attempting to intercept and decrypt phone calls made by then-President Dmitry Medvedev. The GCHQ is said to be involved with the US PRISM program as well.


By the time the G20 delegates' financial leaders met in London five months later for a separate meeting, the GCHQ had apparently improved the surveillance technology enough to create a live map of telephone activity, which it projected onto a large wall in one of its offices, reports The Guardian. The leaked documents note that the effort was "very successful" in allowing them to see delegates' activity. This program is said to have only run for six months, though it's unclear if a newer technology has replaced it. Britain will be hosting heads of state once again for the G8 summit tomorrow.

Update: BlackBerry tells us that there is no secret backdoor into its devices and that it stands by the security of its software. The company's full statement is below.

Is Your ISP Throttling Your Internet Connection?

Think your Internet Service Provider (ISP) is messing with your connection performance? Now you can find out, with Google's new online tools that will diagnose your network connection. Here's a quick walkthrough on how to make the best use of them.
Google's broadband test tools are located at Measurementlab.net. On that page, you'll see an icon that says "Users: Test Your Internet Connection". Click that, and you'll be taken to a page with three tests available and two more listed as coming soon. However, of the three available tests, only one is fully automated and easy to use.

Glasnost, second on the list, will check whether your ISP is slowing down (as Comcast has done) or blocking peer-to-peer (P2P) downloads from software such as BitTorrent. P2P apps are commonly used for downloading pirated software and media content like movies and music, but they have legal uses as well, such as distributing large software packages to many users at once.
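Glasnost itself runs as a hosted Java applet, but its core trick is easy to sketch: measure the transfer rate of a protocol-specific flow (say, bytes that look like a BitTorrent handshake) and compare it against a neutral control flow; a large gap suggests protocol throttling. The sketch below is a hypothetical, loopback-only illustration of that measurement step, so it exercises your machine rather than your ISP; a real detector would run the client against a remote measurement server.

```python
import socket
import threading
import time

def start_sink_server():
    """Start a one-connection sink on an ephemeral loopback port and
    return the port. The server reads and discards everything sent."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        while conn.recv(65536):
            pass
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return port

def measure_throughput(port, payload, duration=0.5):
    """Stream `payload` repeatedly for `duration` seconds and return
    the achieved rate in bytes per second. Glasnost-style detection
    runs this twice -- protocol-specific bytes vs. control bytes --
    and flags throttling when the two rates differ significantly."""
    sock = socket.create_connection(("127.0.0.1", port))
    sent = 0
    start = time.monotonic()
    while time.monotonic() - start < duration:
        sock.sendall(payload)
        sent += len(payload)
    elapsed = time.monotonic() - start
    sock.close()
    return sent / elapsed

port = start_sink_server()
# b"\x13BitTorrent protocol" is the real BitTorrent handshake prefix.
bt_rate = measure_throughput(port, b"\x13BitTorrent protocol" + b"\x00" * 1000)
print(f"BitTorrent-like flow over loopback: {bt_rate / 1e6:.0f} MB/s")
```

Over loopback the two flows will of course run at the same speed; the comparison only becomes meaningful across a real ISP path.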

To use the measurement tool, you will be redirected to the Glasnost site. You'll need the latest version of Java installed, and you should stop any large downloads that you may have running before you begin the test. If you're on a Mac, a popup message will prompt you to trust the site's Java applet.

When you're ready to start, you can choose whether you want to run a full test (approximately 7 minutes long) or a simple test (4 minutes long). When I tried to test my connection, Glasnost's measurement servers were overloaded and an alternative server was offered, but that was overloaded as well. After a short while I was able to run the test.

In the tests of my connection (my provider is Vodafone At Home, in the UK) all results indicated that BitTorrent traffic is not blocked or throttled. But I'm looking forward to hearing from you in the comments how your ISP performed in Glasnost's diagnostics. Meanwhile, make sure you keep an eye on the other tests that will be available soon from Measurementlab.net.
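For a rough sense of how a Glasnost-style test reaches its verdict, the sketch below compares the throughput of BitTorrent-like flows against neutral control flows on the same path. The numbers and the tolerance threshold here are made up for illustration; the real tool replays actual protocol traffic against Measurement Lab servers.

```python
# Sketch of the comparison logic behind Glasnost-style throttling tests.
# Thresholds and sample throughputs below are hypothetical.

from statistics import median

def looks_throttled(bittorrent_kbps, control_kbps, tolerance=0.2):
    """Flag throttling if BitTorrent-like flows run markedly slower
    than control flows carrying random bytes on the same path."""
    bt, ctrl = median(bittorrent_kbps), median(control_kbps)
    return bt < ctrl * (1 - tolerance)

# BitTorrent flows capped near 100 kbps while controls reach ~900 kbps:
print(looks_throttled([95, 102, 99], [880, 910, 905]))    # True: likely throttled
print(looks_throttled([870, 905, 890], [880, 910, 905]))  # False: no shaping seen
```

Comparing medians rather than single runs is what lets a tool like this tolerate ordinary network noise, including the overloaded servers mentioned above.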

Brazil's ban on U.S. Internet services may prove futile

Brazil's government is considering installing new hardware locally to reduce the country's dependence on U.S. services for Internet access. The move comes in response to reports that the U.S. government had intercepted emails and phone calls of Brazilian citizens, its state-run oil company and the country's president, Dilma Rousseff.

In order to bypass the U.S., Brazil is considering several steps, including opening local data centers that would be subject to the country’s privacy laws, removing sensitive data from the cloud and storing it locally, and potentially creating a BRICS cable connecting to the eastern Russian city of Vladivostok through a series of cables running through South Africa and Asia.

In addition, President Rousseff is pursuing legislation that would require major Internet companies, including Google and Facebook, to store all data gathered in Brazil in the country’s local data centers.

The efforts follow President Rousseff’s decision to postpone a scheduled trip to visit the U.S. this week and demand an apology from U.S. President Barack Obama after evidence of NSA spying in Brazil appeared in documents leaked by NSA whistleblower Edward Snowden earlier this month. The documents reportedly indicated that the NSA had spied on Brazil’s state-run oil company, Petrobras, as well as the office of President Rousseff. Others were named in the reports, including Google and Mexico’s new president, Enrique Peña Nieto.

The efforts at Internet independence aren’t entirely new in the country. A bill introduced in 2012 called the “Internet Constitution” would require social media companies to delete Brazilians’ data once their profiles have been closed, and attempts to lay out a framework for Internet use among the country’s citizens, according to Reuters. President Rousseff reportedly asked that the plan to open new local data centers be incorporated into the bill.

Leonardo Santos, a spokesman for Brazilian legislator Alessandro Molon, who has pushed the Internet Constitution since it was first written, told Reuters that the proposed changes may be “difficult,” but are not impossible. Others doubted that Brazil would be able to contain all data created in Brazil, simply because of the global nature of the Internet, which allows Brazilian citizens to exchange data with those in other countries.

Meanwhile, others doubt that the efforts will actually prevent the NSA from accessing Brazilian data, considering that the agency is believed to have backdoors into major U.S. Internet services.

"They are a step towards getting out [of] the very strong control the U.S. has over the Internet infrastructure," Dr Joss Wright, a cybersecurity researcher at the Oxford Internet Institute, told the BBC. "But if you send an email from your encrypted Brazilian provider to somebody else who has a Gmail account then Google is getting to read the thread of information anyway.”

As for the BRICS cable, Dr. Wright added that the Brazilian government “can't guarantee that just because there is a new high-capacity cable running from Brazil to Russia that all the data will go through it rather than an alternative."

Making matters more difficult is Brazil’s current Internet infrastructure. In a 2012 report that ranked 30 countries by the risk in their data center operations, Brazil ranked 30th. The report, compiled by real estate firm Cushman & Wakefield and engineering consultancy hurleypalmerflatt, cited the country’s high electricity costs and poor education levels as the main factors dragging down its data center quality, according to Reuters.

Aaron Swartz, internet freedom activist, dies aged 26

Aaron Swartz, a celebrated internet freedom activist and early developer of the website Reddit, has died at 26.
The activist and programmer took his life in his New York apartment, a relative and the state medical examiner said. His body was found on Friday.

Mr Swartz began computer programming as a child, and at 14 co-authored an early version of the RSS specification.

Leading internet figures and friends paid tribute to Mr Swartz via tweets or blogs.
After leaving Reddit, Mr Swartz became an advocate of internet freedom, and was facing hacking charges at the time of his death.
He was among the founders of the Demand Progress campaign group, which lobbies against internet censorship.

The hacking charges relate to the downloading of millions of academic papers from online archive JSTOR, which prosecutors say he intended to distribute for free.
He denied charges of computer fraud at an initial hearing last year, but his federal trial was due to begin next month.
Mr Swartz's lawyer Elliot R. Peters confirmed the news of his client's death in an email to MIT's student newspaper, The Tech.

"The tragic and heartbreaking information you received is, regrettably, true," he wrote.
A spokeswoman for New York's medical examiner later confirmed to Associated Press news agency that Mr Swartz had hanged himself.

In a statement later on Saturday, Mr Swartz's family praised his "brilliance" and "profound" commitment to social justice and also expressed bitterness toward the prosecutors pursuing the case against him.

"Aaron's death is not simply a personal tragedy. It is the product of a criminal justice system rife with intimidation and prosecutorial overreach," the statement said.
Sir Tim Berners-Lee - the British inventor of the world wide web - commemorated Mr Swartz in a Twitter post: "Aaron dead. World wanderers, we have lost a wise elder. Hackers for right, we are one down. Parents all, we have lost a child. Let us weep."

Google, Facebook rule Age of Internet Empires

A new map shows that Google and Facebook rule the internet, ranking as the most visited sites in more than 60 countries around the world.

Google rules the world... at least the online world, as the most visited site worldwide. Google has become part of everyday life for so many people that the brand name "Google" is now a verb.

A new map illustrating the most visited websites in each country shows how far and wide Google's users are spread. The second most visited site was Facebook.

The map, created by researchers at the Oxford Internet Institute (OII), used data collected by Alexa on August 12, 2013.

While Google and Facebook reign supreme over any other site on the web, what is interesting is the geographical continuity of these two internet "empires".

"Google is the most visited website in most of Europe, North America, and Oceania. Facebook, in contrast, is the most visited website in most of the Middle East and North Africa, as well as much of the Spanish-speaking Americas," says the finding of the OII.

The creators of the map presented their findings as a choropleth map, with colours indicating each country's most visited site. The two most prominent, Google (red) and Facebook (blue), are presented in an old-style colonial map. Interestingly, it has been named after a computer game series titled "Age of Empires".

The creators also made a second map presenting the same data in hexagonal cartogram (see bottom).

NO GOOGLE IN CHINA
Interestingly, while Google reigns over most of the world, the one region where it has been unable to gain a toehold is Asia. In China, Baidu (marked in green) rules the roost. Baidu, curiously, is also the most visited website in South Korea, ahead of the country's own search engine, Naver.

In Russia, search engine Yandex tops the list while Yahoo! Japan is the preferred search engine in Japan.


Google's Excellent Plan To Bring Wireless Internet To Developing Countries

The WSJ is reporting that Google is working on various technologies to bring wireless internet access to a number of developing countries. This is an absolutely excellent plan and one that we should be applauding. But not necessarily for what it will do for Google itself: rather, for what it will do for those developing countries:

Google Inc. is deep into a multipronged effort to build and help run wireless networks in emerging markets as part of a plan to connect a billion or more new people to the Internet.

These wireless networks would serve areas such as sub-Saharan Africa and Southeast Asia to dwellers outside of major cities where wired Internet connections aren’t available, said people familiar with the strategy.

The networks also could be used to improve Internet speeds in urban centers, these people said.

They’re looking at all sorts of different technologies to achieve this: from satellites through blimps to microcells that broadcast a 3G signal perhaps half a mile. There will be a certain amount of mix and match depending upon the precise circumstances of the area to be covered.

The thing I want to concentrate on though is the economic effect upon the countries where such networks are installed. We know from our economic history that the roll out of the telephone network aided in the development of the currently advanced countries. We’ve also had more recent studies looking at the effect of mobile phone networks on countries that don’t have a landline network. The effects are remarkably large. An extra 10% of the population with a mobile leads to an extra 0.5% growth in GDP year by year. That’s a seriously large effect from the addition of just one technology: presumably because being able to communicate is what makes nearly all other technological adaptations adoptable.
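To see why that 0.5% figure matters, it helps to compound it over time. A minimal sketch, using an assumed 3% baseline growth rate and a 20-year horizon (both illustrative assumptions, not figures from the studies cited):

```python
# Compounding the cited elasticity: +10 percentage points of mobile
# penetration is associated with ~+0.5 percentage points of annual GDP
# growth. Baseline rate and horizon below are illustrative only.

def gdp_after(years, base_growth=0.03, extra=0.005, gdp=100.0):
    """GDP (indexed to 100 today) after `years` of compound growth."""
    return gdp * (1 + base_growth + extra) ** years

baseline = 100.0 * 1.03 ** 20   # growth without the mobile boost
boosted = gdp_after(20)         # growth with the extra 0.5 points
print(round(baseline, 1), round(boosted, 1))  # → 180.6 199.0
```

Half a percentage point sounds small, but over two decades it leaves the economy roughly 10% larger than it would otherwise have been, which is exactly why the effect is described as seriously large.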

We’ve also been seeing the beginnings of reports from various of the telecoms consultancies telling us how the expansion of broadband, then mobile broadband (or smartphones, to taste) has been leading to further growth in the societies that have adopted them. Given that we can see that communications networks have had this effect before we shouldn’t be surprised that the latest one is again having such an effect. Although it would be fair to say that the studies about broadband are still pretty sketchy: we’ve not got a great deal of accurate evidence as yet.

So, we would expect the addition of mobile broadband in a society that doesn’t have landline broadband to be a useful thing: leaping over one technology in order to get that comms network up and running. Usually when I mention this I get comments that I just don’t understand how limited mobile bandwidth is compared to fibre: but that doesn’t really apply in this case.

Think it through for a moment: rolling out Google Fiber to the entire US is said to cost well north of $100 billion. OK, now let us think about rolling the same technology out to a developing economy of 30 million people (10% of the US population). It’s not out of line to suggest that this would cost perhaps 10% of the US figure: $14 billion or so. But no developing country could possibly afford to pay that much. For Albania, Papua New Guinea or Cambodia, that sort of cost is more than their entire annual production, more than everyone in the place put together produces. They simply cannot afford to make that leap to fibre broadband for all, whatever effect it might have on the future growth of the economy.

Thus the very much cheaper (even if with less capacity) roll out of mobile broadband makes great economic sense. Thus, as I say at the top, this plan is something we should cheer.

There’s also a little sidepoint that rather amuses me. It’s often said that Henry Ford paid his workers $5 a day so they could be rich enough to buy his cars. This story is, I’m afraid, complete nonsense. But it could end up being true that by developing low cost mobile broadband access, then deploying it, Google could make hundreds of millions of those currently destitute peasants rich enough to contribute to Google’s sales and profits. I know we’re not supposed to celebrate greed as being good these days but my own personal view is that Google can make absolutely as much money as it wants if the byproduct is those hundreds of millions moving up out of absolute poverty.

U.S., company officials: Internet surveillance does not indiscriminately mine data

The director of national intelligence on Saturday stepped up his public defense of a top-secret government data surveillance program as technology companies began privately explaining the mechanics of its use.

The program, code-named PRISM, has enabled national security officials to collect e-mail, videos, documents and other material from at least nine U.S. companies over six years, including Google, Microsoft and Apple, according to documents obtained by The Washington Post.

The disclosures about PRISM have renewed a national debate about the surveillance systems that sprang up after the attacks of Sept. 11, 2001, how broad those systems might be and the extent of their reach into American lives.

In a statement issued Saturday, Director of National Intelligence James R. Clapper Jr. described PRISM as “an internal government computer system used to facilitate the government’s statutorily authorized collection of foreign intelligence information from electronic communication service providers under court supervision.”

“PRISM is not an undisclosed collection or data mining program,” the statement said.

Clapper also said that “the United States Government does not unilaterally obtain information from the servers of U.S. electronic communication service providers. All such information is obtained with FISA Court approval and with the knowledge of the provider based upon a written directive from the Attorney General and the Director of National Intelligence.”

The statement from Clapper is both an affirmation of PRISM and the government’s strongest defense of it since its disclosure by The Post and the Guardian on Thursday. On Wednesday, the Guardian also disclosed secret orders enabling the National Security Agency to obtain data from Verizon about millions of phone calls made from the United States.

Clapper called the disclosures “rushed” and “reckless,” with “inaccuracies” that have left “significant misimpressions.”

“Disclosing information about the specific methods the government uses to collect communications can obviously give our enemies a ‘playbook’ of how to avoid detection,” Clapper said. “Nonetheless, [the law governing PRISM] has proven vital to keeping the nation and our allies safe. It continues to be one of our most important tools for the protection of the nation’s security.”

In responding to the revelations about PRISM, the White House, some lawmakers and company officials have repeatedly suggested that secret court orders are issued every time the NSA or other intelligence agencies seek information under Section 702 of the Foreign Intelligence Surveillance Act. But the orders, which are also secret, serve as one-time blanket approvals for data acquisition and surveillance on selected foreign targets for periods of as long as a year.

The companies have publicly denied any knowledge of PRISM or any system that allows the government to directly query their central servers. But because the program is so highly classified, only a few people at most at each company would legally be allowed to know about PRISM, let alone the details of its operations.

Executives at some of the participating companies, who spoke on the condition of anonymity, acknowledged the system’s existence and said it was used to share information about foreign customers with the NSA and other parts of the nation’s intelligence community.

These executives said PRISM was created after much negotiation with federal authorities, who had pressed for easier access to data they were entitled to under previous orders granted by the secret FISA court.

One top-secret document obtained by The Post described it as “Collection directly from the servers of these U.S. Service Providers: Microsoft, Yahoo, Google, Facebook, PalTalk, AOL, Skype, YouTube, Apple.”

Intelligence community sources said that this description, although inaccurate from a technical perspective, matches the experience of analysts at the NSA. From their workstations anywhere in the world, government employees cleared for PRISM access may “task” the system and receive results from an Internet company without further interaction with the company’s staff.

In intelligence parlance, PRISM is the code name for a “signals intelligence address,” or SIGAD, in this case US-984XN, according to the NSA’s official classified description of PRISM and sources interviewed by The Post. The SIGAD is used to designate a source of electronic information, a point of access for the NSA and a method of extraction. In those terms, PRISM is not a computer system but a set of technologies and operations for collecting intelligence from Facebook, Google and other large Internet companies.

According to a more precise description contained in a classified NSA inspector general’s report, also obtained by The Post, PRISM allows “collection managers [to send] content tasking instructions directly to equipment installed at company-controlled locations,” rather than directly to company servers. The companies cannot see the queries that are sent from the NSA to the systems installed on their premises, according to sources familiar with the PRISM process.

Crucial aspects about the mechanisms of data transfer remain publicly unknown. Several industry officials told The Post that the system pushes requested data from company servers to classified computers at FBI facilities at Quantico. The information is then shared with the NSA or other authorized intelligence agencies.

According to slides describing the mechanics of the system, PRISM works as follows: NSA employees engage the system by typing queries from their desks. For queries involving stored communications, the queries pass first through the FBI’s electronic communications surveillance unit, which reviews the search terms to ensure there are no U.S. citizens named as targets.

That unit then sends the query to the FBI’s data intercept technology unit, which connects to equipment at the Internet company and passes the results to the NSA.

The system is most often used for e-mails, but it handles chat, video, images, documents and other files as well.

“The server is controlled by the FBI,” an official with one of the companies said. “We do not offer a download feature from our server.”

Another industry official said, “No one wants the bureau logging into the company server.”

On Friday, President Obama defended the secret surveillance program, saying it makes “a difference in our capacity to anticipate and prevent possible terrorist activity.”

Obama said Congress was fully informed about the efforts, which are tightly controlled by legal authorities under FISA. “If every step that we’re taking to try to prevent a terrorist act is on the front page of the newspapers or on television,” he said, “then presumably the people who are trying to do us harm are going to be able to get around our preventive measures.”

Clapper’s statement Saturday emphasized that the program was legal under Section 702 of FISA, as approved by Congress in 2008.

The law governs surveillance of foreign nationals. It was originally passed in 1978, after scandals involving the FBI, IRS and White House during the civil rights movement of the 1960s and the Vietnam War.

Section 702 provides the post-9/11 legal framework for the “targeted acquisition” of intelligence about foreign persons outside the United States. The information can be obtained only under a FISA court order and a written directive from the attorney general and the director of national intelligence.

Under Section 702, the attorney general and director of national intelligence must show the FISA court that they have procedures “reasonably designed to ensure” that their intercepts will target foreigners “reasonably believed” to be overseas.

“Service providers supply information to the Government when they are lawfully required to do so,” Clapper said Saturday.

The law prohibits officials from intentionally targeting data collection efforts at U.S. citizens or anyone in the United States. The standards for intentional targeting require that an analyst have a “reasonable belief,” at least 51 percent confidence, that the target is a foreign national.

The law also provides “an extensive oversight regime, incorporating reviews by the Executive, Legislative and Judicial branches,” Clapper said in the statement.

One top-secret document shows that the government is making systematic use of PRISM. An internal presentation of 41 briefing slides on PRISM suggested the scale of data collection. It described the system as the most prolific contributor to the President’s Daily Brief, which cited PRISM data in 1,477 items last year. According to the slides and other supporting materials obtained by The Post, “NSA reporting increasingly relies on PRISM” as its leading source of raw material, accounting for nearly one in seven intelligence reports.


Internet traffic was routed via Chinese servers

Nearly 15 percent of the world’s Internet traffic, including that of many U.S. government and military sites, was briefly redirected through computer servers in China in April, according to a congressional commission report due out this week.

It is not clear whether the incident was deliberate, but the capability could enable severe malicious activities including the diversion of data and the interception of supposedly secure encrypted Internet traffic, the U.S.-China Economic and Security Review Commission states in a report to Congress.

A draft copy of the report, which is to be released Wednesday but viewed by The Washington Times, reports for the first time that .gov and .mil websites were affected by the 18-minute-long April 8 redirection, including those for the Senate, all four military services, the office of the secretary of defense, the National Aeronautics and Space Administration, the Department of Commerce, the National Oceanic and Atmospheric Administration “and many others,” as well as commercial websites including those of Dell, Yahoo, Microsoft and IBM.

In effect, Internet traffic to and from those sites was wrongly told that the best route it could take to its destination was through servers in China.

The redirection, though brief, could have enabled “surveillance of specific users or sites [and] … could even allow a diversion of data to somewhere that the user did not intend,” the report states. The huge volume of traffic redirected could have been intended to cover a targeted attack on a single website or user.

“Perhaps most disconcertingly … control over diverted data could possibly allow a telecommunications firm to compromise the integrity of supposedly secure encrypted sessions,” the report adds.

It remains unclear whether the redirection was intentional, the report says, but it demonstrates that it is possible for malicious actors to seize control of the Internet and redirect traffic.

“Evidence related to this incident does not clearly indicate whether it was perpetrated intentionally and, if so, to what ends,” the report says. “Regardless of whether Chinese actors actually intended to manipulate U.S. and other foreign Internet traffic, China’s Internet engineers have the capability to do so.”

The commission notes that Beijing is exercising considerable control over the Internet inside China, and over the limited debate it permits on certain topics on the Web, in an effort to defuse popular demands for reform - a phenomenon it dubs “networked authoritarianism.” The news comes as Google has issued a call to Western governments to challenge Internet censorship as a restraint on global trade.

The report further notes that China has a history of “malicious computer activities” that “raise questions about whether China might seek intentionally to leverage these abilities to assert some level of control over the Internet, even for a brief period.”

Any such attempt, the report states, “would likely be counter to the interests of the United States and other countries.”

“At the very least, these incidents demonstrate the inherent vulnerabilities in the Internet’s architecture,” the report concludes.

Internet traffic moves through the network in small data packets, with routes determined by routing information that special servers around the globe exchange under agreed sets of rules known as protocols.

On April 8, according to Web security specialists, a small Chinese Internet service provider published a set of instructions under the Border Gateway Protocol that directed Web traffic from about 37,000 networks to route itself via computer servers in China.

The list was republished by China Telecom and briefly propagated across the global network, which works on a trust system, with each server updating its routing instructions based on data provided by others in the network.
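The mechanics are easier to see in miniature. Routers forward each packet toward the most specific (longest) matching prefix in their table, so one common hijack mechanism is to announce a more specific prefix than the legitimate origin does. The sketch below illustrates that selection rule; the prefixes and AS numbers are made up, and the April incident itself is reported to have involved re-announcing existing routes rather than necessarily more-specific ones.

```python
# Why a bogus BGP announcement can divert traffic: routers pick the
# most specific matching prefix. All prefixes and AS numbers here are
# invented for illustration.

import ipaddress

routing_table = {
    ipaddress.ip_network("192.0.2.0/24"): "AS64500 (legitimate origin)",
}

def best_route(table, addr):
    """Longest-prefix match over a simple routing table."""
    ip = ipaddress.ip_address(addr)
    matches = [net for net in table if ip in net]
    return table[max(matches, key=lambda net: net.prefixlen)]

print(best_route(routing_table, "192.0.2.10"))  # legitimate origin wins

# A hijacker announces a more specific /25 covering the same addresses:
routing_table[ipaddress.ip_network("192.0.2.0/25")] = "AS64666 (bogus origin)"
print(best_route(routing_table, "192.0.2.10"))  # now the bogus route wins
```

Because routes spread on trust, an announcement like this can propagate across the globe within minutes unless operators filter it, which is how a single provider's bad routing data briefly pulled in traffic from thousands of networks.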

“We recommend that Congress get the government to produce a full report every year” about these kinds of incidents, commission member Larry Wortzel, a retired colonel in Army intelligence and a China specialist, told The Times.

A more comprehensive accounting is needed of how often they occur and how severe they are, he said.

“We see this stuff coming in piecemeal [but the government] has to put into place better analytic tools to spot these kinds of diversions and track which servers and routes are being used,” Mr. Wortzel said.

He said China Telecom is not an independent commercial entity, calling it “a surrogate, owned and controlled by the Chinese government.”

The report notes that “China’s leadership, at all levels of the government, increasingly uses the Internet to interact with the Chinese people.” Combined with “strict censorship controls,” this means Beijing could “allow a controlled online debate about certain issues” and then “leverage what it learns from following this debate to construct policies that aim to undercut the most serious irritants to domestic stability.”

The report calls this “networked authoritarianism,” noting several efforts by the authorities to collect opinions from Chinese Internet users, for example, before major national meetings by the government or ruling Communist Party.

In another such effort, in September the Chinese Communist Party’s official newspaper, the People’s Daily, launched a website called “Direct Line to Zhongnanhai,” a reference to the compound that houses China’s president and other senior party figures.

But submission guidelines ban any comment “which harms the state’s honor or interests” or “undermines state policy on religion or advocates heretical organizations or feudal superstitions.”

“These guidelines serve as a window into the government’s efforts to control the boundaries and nature of discussions online,” the report notes.

In a white paper issued Monday, Google Inc., which recently curtailed its activities in China in response to Beijing’s efforts to control the Web, called Internet censorship, non-transparent regulation and online surveillance “the trade barriers of the 21st century economy.”

“In addition to infringing on human rights, governments that block the free flow of information on the Internet are also blocking trade and economic growth,” the Internet-service company said in a statement.


Should Internet Data Be Mined in the Name of National Security?

The Washington Post Thursday published a story detailing a widespread government program that monitors online activities of users of nine major U.S. internet companies. The PRISM surveillance program began in 2007 and has been mining e-mails, documents, photographs and both audio and video chats.

A career intelligence officer provided documents about the program to the Post because of what the paper called the officer's "horror at their capabilities." He said the fact that the government "quite literally can watch your ideas form as you type" is a "gross intrusion of privacy."

The servers of Apple, AOL, Facebook, Google, Microsoft, Yahoo, Skype, YouTube and PalTalk are subject to examination by the court-approved program. Several of these companies denied knowledge of the program, and said they did not knowingly create a "back door" that allowed the government access to their user data.

[Check out our editorial cartoons on President Obama.]

"Google cares deeply about the security of our users' data," said a company spokesman. "We disclose user data to government in accordance with the law, and we review all such requests carefully. From time to time, people allege that we have created a government 'back door' into our systems, but Google does not have a 'back door' for the government to access private user data."

The Post published a 43-slide briefing presentation dated April 2013 that was made for senior analysts at the National Security Agency's Signals Intelligence Directorate, and cited the information collected through the program as an abundant source for the President's Daily Brief. PRISM grew out of President George W. Bush's warrantless domestic surveillance program, but has not been disclosed publicly before now.

The administration defended the program as a vital tool in the fight against terror, and said the court order includes "extensive procedures … to ensure that only non-U.S. persons outside the U.S. are targeted, and that minimize the acquisition, retention and dissemination of incidentally acquired information about U.S. persons."

[See a collection of political cartoons on defense spending.]

"Information collected under this program is among the most important and valuable intelligence information we collect, and is used to protect our nation from a wide variety of threats," said National Intelligence Director James R. Clapper.

The administration, however, was unable to estimate how many Americans' data may have been collected incidentally through the program. The Post report said the information collected is accessed by analysts in Fort Meade, Md., who use search terms intended to produce at least a 51 percent confidence rate in a target's "foreignness," and said analysts were supposed to report any U.S. data collected. It added, though, that training materials described such accidental collection as "nothing to worry about."

[Read the U.S. News Debate: Should Probable Cause Be Required for Police to Use Cell Phone Location Data?]

The American Civil Liberties Union, however, disagrees. "The secrecy surrounding the government's extraordinary surveillance powers has stymied our system of checks and balances," said the ACLU's Washington Legislative Office Director Laura Murphy in a statement. "Congress must initiate an investigation to fully uncover the scope of these powers and their constraints, and it must enact reforms that protect Americans' right to privacy and that enable effective public oversight of our government. There is a time and a place for government secrecy, but true democracy demands that the governed be informed of the rules of play so as to hold elected officials to account."

The Post report Thursday follows a revelation Wednesday by the Guardian that the NSA has also been collecting the data of millions of Verizon customers.

What do you think? Should internet data be mined in the name of national security? Take the poll and comment below.