Posts Tagged Interesting Facts

Code-cracking and computers (BBC)

By Mark Ward
Technology correspondent, BBC News

Colossus, BBC

By the end of WWII, 11 Colossus machines were in use

Bletchley Park is best known for the work done on cracking the German codes and helping to bring World War II to a close far sooner than might have happened without those code breakers.

But many believe Bletchley should be celebrated not just for what it ended but also for what it started – namely the computer age.

The pioneering machines at Bletchley were created to help codebreakers cope with the enormous volume of enciphered material the Allies managed to intercept.

The machine that arguably had the greatest influence in those early days of computing was Colossus – a rebuilt version of which now resides in the National Museum of Computing, also on the Bletchley Park site.

Men and machine

The Enigma machines were used by the field units of the German Army, Navy and Air Force. But the communications between Hitler and his generals were protected by different machines: the Lorenz SZ40 and SZ42.

The German High Command used the Lorenz machine because it was so much faster than the Enigma, making it much easier to send large amounts of text.

“For about 500 words Enigma was reasonable but for a whole report it was hopeless,” said Jack Copeland, professor of philosophy at the University of Canterbury in New Zealand, director of the Turing Archive and a man with a passionate interest in the Bletchley Park computers.

Hut 6 during wartime, Bletchley Park Trust

Bletchley employed thousands of code breakers during wartime

The Allies first picked up the stream of enciphered traffic, dubbed Tunny, in 1940. The importance of the material it contained soon became apparent.

Like Enigma, the Lorenz machines enciphered text by mixing it with characters generated by a series of pinwheels.
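In modern terms the Lorenz machine was an additive stream cipher: each 5-bit teleprinter character was combined, bit by bit, with a key character produced by the wheels, and applying the same key again undid the encipherment. The sketch below, in Python, is a loose illustration of that principle only – the real SZ40/42 used twelve wheels with far subtler stepping – and its wheel sizes and pin settings are invented:

    import random

    # Pin counts per wheel are invented for illustration.
    WHEEL_SIZES = [41, 31, 29, 26, 23]   # one wheel per bit of a 5-bit character

    def make_wheels(seed):
        """Build five pinwheels, each a fixed ring of 0/1 pins."""
        rng = random.Random(seed)
        return [[rng.randint(0, 1) for _ in range(size)] for size in WHEEL_SIZES]

    def keystream(wheels):
        """Yield 5-bit key characters; every wheel steps once per character."""
        pos = [0] * len(wheels)
        while True:
            yield sum(w[p] << i for i, (w, p) in enumerate(zip(wheels, pos)))
            pos = [(p + 1) % len(w) for p, w in zip(pos, wheels)]

    def crypt(chars, wheels):
        """XOR each 5-bit character with the keystream. XOR is self-inverse,
        so the same function both enciphers and deciphers."""
        return [c ^ k for c, k in zip(chars, keystream(wheels))]

    wheels = make_wheels(seed=1944)
    plain = [3, 25, 16, 16, 31]                # arbitrary 5-bit teleprinter codes
    cipher = crypt(plain, wheels)
    assert crypt(cipher, wheels) == plain      # deciphering recovers the original

Breaking the cipher meant recovering exactly this kind of hidden wheel pattern from intercepted traffic alone – which is what the codebreakers did by hand before Colossus.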

“We broke wheel patterns for a whole year before Colossus came in,” said Captain Jerry Roberts, one of the codebreakers who deciphered Tunny traffic at Bletchley.

“Because of the rapid expansion in the use of Tunny, our efforts were no longer enough and we had to have the machines in to do a better job.”

The man who made Colossus was Post Office engineer Tommy Flowers, who had instantly impressed Alan Turing when the maverick mathematician asked him to design a machine to help with his war work.

But, said Capt Roberts, Flowers could not have built his machine without the astonishing work of Cambridge mathematician Bill Tutte.

“I remember seeing him staring into the middle distance and twiddling his pencil and I wondered if he was earning his corn,” said Capt Roberts.

But it soon became apparent that he was.

“He figured out how the Lorenz machine worked without ever having seen one and he worked out the algorithm that broke the traffic on a day-to-day basis,” said Capt Roberts.

“If there had not been Bill Tutte, there would not have been any need for Tommy Flowers,” he said. “The computer would have happened later. Much later.”

Valve trouble

Prof Copeland said Tommy Flowers faced scepticism from Bletchley Park staff and others that his idea for a high-speed computer employing thousands of valves would ever work.

Valves on Colossus, BBC

Colossus kept valves lit to ensure they kept on working

“Flowers was very much swimming against the current as valves were only being used in small units,” he said. “But the idea of using large numbers of valves reliably was Tommy Flowers’ big thing. He’d experimented and knew how to control the parameters.”

And work it did.

The close co-operation between the human translators and the machines meant that the Allies got a close look at the intimate thoughts of the German High Command.

Information gleaned from Tunny was passed to the Russians and was instrumental in helping them defeat the Germans at Kursk – widely seen as one of the turning points of WWII.

The greater legacy is the influence of Colossus on the origins of the computer age.

“Tommy Flowers was the key figure for everything that happened subsequently in British computers,” said Prof Copeland.

After the war Bletchley veterans Alan Turing and Max Newman separately did more work on computers using the basic designs and plans seen in Colossus.

Turing worked on the Automatic Computing Engine for the British government and Newman helped to bring to life the Manchester Small Scale Experimental Machine – widely acknowledged as the first stored-program computer.

The work that went into Colossus also shaped the thinking of Maurice Wilkes, Freddie Williams, Tom Kilburn and many others – essentially the whole cast of characters from whom early British computing arose.

And the rest, as they say, is history.

Spielberg hails Xbox controller (BBC)

Autonomous tech ‘requires debate’ (BBC)

By Jason Palmer
Science and technology reporter, BBC News

Autonomous vehicle at Heathrow (PA)

Fully autonomous rapid transit systems already exist at Heathrow Airport

The coming age of lorries that drive themselves or robots that perform surgery is fraught with legal and ethical issues, says a new report.

The Royal Academy of Engineering says that automated freight transport could be on the roads in as few as 10 years.

Also, it says, robotic surgery will begin to need less human intervention.

But it suggests that much debate is needed to address the ethical and legal issues raised by putting responsibility in the hands of machines.

“We’re all used to automatic systems – lifts, washing machines. We’re talking about levels above that,” said Lambert Dopping-Heppenstal of the Academy’s engineering ethics working group.

“It’s about systems that have some level of self-determination.”

Coming era

Issues surrounding autonomous systems and robots with such self-determination have been discussed for a number of years, particularly with regard to the autonomous machines of warfare.

However, the era of autonomous road vehicles and surgeons is slowly becoming reality, making the issues more urgent, the report says.

The removal of direct control from a car’s driver is already happening, with anti-lock braking systems and even automatic parking systems becoming commonplace.

But the next step is moving toward completely driverless road vehicles, which already exist in a number of contexts, including London’s Heathrow Airport.

Robotic surgery console (PA)

The time may come that robotic surgeons operate without human help

The Darpa Grand Challenge, a contest sponsored by the US defence department’s research arm, has driverless cars negotiating traffic and obstacles and obeying traffic rules over courses nearly 100km long.

“Those machines would have passed the California driving test, more than I would have,” said Professor Will Stewart, a fellow of the Academy.

“Autonomous vehicles will be safer. One of the compelling arguments for them is that the machine cannot have an argument with its wife; it can run 24 hours a day without getting tired. But it is making decisions on its own.”

Professor Stewart and report co-author Chris Elliott remain convinced that autonomous systems will prove, on average, to be better surgeons and better lorry drivers than humans are.

But when they are not, it could lead to a legal morass, they said.

“If a robot surgeon is actually better than a human one, most times you’re going to be better off with a robot surgeon,” Dr Elliott said. “But occasionally it might do something that a human being would never be so stupid as to do.”

Professor Stewart concluded: “It is fundamentally a big issue that we think the public ought to think through before we start trying to imprison a truck.”

40 years of Unix (BBC)

By Mark Ward
Technology Correspondent, BBC News

Network cables, BBC

Unix had computer networking built in from the start

The computer world is notorious for its obsession with what is new – largely thanks to the relentless engine of Moore’s Law that endlessly presents programmers with more powerful machines.

Given such permanent change, anything that survives for more than one generation of processors deserves a nod.

Think, then, what the Unix operating system deserves: in August 2009 it celebrates its 40th anniversary, it has been in use through every year of those four decades, and today it is getting more attention than ever before.

Work on Unix began at Bell Labs after AT&T (which owned the lab), MIT and GE pulled the plug on an ambitious project to create an operating system called Multics.

The idea was to make better use of the resources of mainframe computers and have them serve many people at the same time.

“With Multics they tried to have a much more versatile and flexible operating system, and it failed miserably,” said Dr Peter Salus, author of the definitive history of Unix’s early years.

Time well spent

The cancellation meant that two of the researchers assigned to the project, Ken Thompson and Dennis Ritchie, had a lot of time on their hands. Frustrated by the size and complexity of Multics, but not by its aims of making computers more flexible and interactive, they decided to try to finish the work – albeit on a much smaller scale.

The commitment was helped by the fact that in August 1969, Ken Thompson’s wife took their new baby to see relatives on the West Coast. She was due to be gone for a month and Thompson decided to use his time constructively – by writing the core of what became Unix.

He allocated one week each to the four core components: operating system, shell, editor and assembler. It was during that time, and afterwards as the growing team got the operating system running on a DEC computer known as the PDP-7, that Unix came into being.

By the early 1970s, five people were working on Unix. Thompson and Ritchie had been joined by Brian Kernighan, Doug McIlroy and Joe Ossanna.

The name was reportedly coined by Brian Kernighan – a lover of puns who wanted Unics to stand in contrast to its forebear Multics.

The team got Unix running well on the PDP-7 and soon it had a long list of commands it could carry out. The syntax of many of those commands, such as chdir and cat, is still in use 40 years on. Along with it came the C programming language.
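The economy of those early tools is easy to underestimate. As a rough illustration of the single-purpose philosophy – sketched here in Python rather than the assembler and C of the period – a workable cat needs only to copy each named file, or standard input, to standard output:

    import sys

    def cat(paths):
        """Copy each named file, or standard input if none, to standard output."""
        if not paths:
            sys.stdout.write(sys.stdin.read())
            return
        for path in paths:
            with open(path) as f:
                sys.stdout.write(f.read())

    if __name__ == "__main__":
        cat(sys.argv[1:])

Everything else – searching, sorting, counting – was left to other equally small tools, chained together.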

But, said Dr Salus, it wasn’t just the programming that was important about Unix – the philosophy behind it was vital too.

“Unix was created to solve a few problems,” said Dr Salus, “the most important of which was to have something that was much more compact than the operating systems that were current at that time which ran on the dinosaurs of the computer age.”

Net benefits

Back in the early 1970s, computers were still huge and typically overseen by men in white coats who jealously guarded access to the machines. The idea of users directly interacting with the machine was downright revolutionary.

“It got us away from the total control that businesses like IBM and DEC had over us,” said Dr Salus.

Word about Unix spread and people liked what they heard.

“Once it had jumped out of the lab and out of AT&T it caught fire among the academic community,” Dr Salus told the BBC. What helped this grassroots movement was AT&T’s willingness to give the software away for free.

DEC PDP-1 computer

DEC’s early computers were for many years restricted to laboratories

That it ran on cheap hardware and was easy to move to different machines helped too.

“The fact that its code was adaptable to other types of machinery, in large and small versions meant that it could become an operating system that did more than just run on your proprietary machine,” said Dr Salus.

In May 1975 it got another boost by becoming the chosen operating system for hosts on the early internet. The decision to back it is laid out in RFC 681, a document from the Network Working Group that steered the net’s early development, which notes that Unix “presents several interesting capabilities” for those looking to use it on the net.

It didn’t stop there. Unix was adapted for use on any and every computer from mainframes to desktops. While it is true that it did languish in the 1980s and 90s as corporations scrapped over whose version was definitive, the rise of the web has given it new life.

The wars are over and the Unix specification is looked after by the Open Group – an industry body set up to police what is done in the operating system’s name.

Now Unix, in a variety of guises, is everywhere. Most of the net runs on Unix-based servers and the Unix philosophy heavily influenced the open source software movements and the creation of the Linux desktop OS. Windows runs the communication stack created for Unix. Apple’s OS X is broadly based on Unix and it is possible to dig into that software and find text remarkably similar to that first written by Dennis Ritchie in 1971.

“The really nice part is the flexibility and adaptability,” said Dr Salus, explaining why it is so widespread and how its ethic fits with a world at home with the web.

“Unix is the best screwdriver ever built,” said Dr Salus.

Bing Search Share Rises, Google And Yahoo Slip (Information Week)

Summer, usually a slow time for search, has given Microsoft something to smile about: The company’s Bing search engine gained market share.

By Thomas Claburn,  InformationWeek
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=219400514

Microsoft’s share of the U.S. search market grew slightly in July, while Google and Yahoo experienced slight declines.

Of the 13.6 billion U.S. searches conducted in July, 64.7% were conducted through Google sites, a 0.3 percentage point decline from June, according to ComScore.

Yahoo sites in July served 19.3% of those searches, also a 0.3 percentage point decline from the previous month.

Microsoft Bing’s search share increased by half of a percentage point in July. Its gain accounted for most of what Google and Yahoo lost. Microsoft sites served 8.9% of U.S. searches last month.

As a percentage change, Google’s search query total fell by 4%, Yahoo’s fell by 5%, and Microsoft’s increased by 2%.
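The two kinds of figure are easy to conflate: share moves in percentage points while query volumes move in percent, and both Google numbers can fall at once because the overall market shrank over the summer. A rough reconciliation, combining the article’s figures with my own arithmetic:

    july_total = 13.6e9          # total U.S. searches in July (ComScore)
    google_share_july = 0.647    # 64.7% share in July
    google_share_june = 0.650    # 0.3 percentage points higher in June

    google_july = google_share_july * july_total   # ~8.80 billion queries
    google_june = google_july / (1 - 0.04)         # query total fell 4% from June

    # Implied June market size consistent with both reported figures:
    june_total = google_june / google_share_june   # ~14.1 billion searches
    shrinkage = (1 - july_total / june_total) * 100
    print(f"implied June volume: {june_total / 1e9:.1f}B; "
          f"market shrank about {shrinkage:.1f}%")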

Ask and AOL accounted for 3.9% and 3.1% of the search market in July, respectively.

ComScore’s search share figures do not include searches related to mapping, local directory, and user-generated video sites.

While any gain is good news, Microsoft still has a long way to go. In February, prior to Bing’s launch, ComScore put Microsoft’s share of the U.S. search market at 8.5%.

In terms of worldwide search market share, Google processed 78.45% of all searches in July, according to NetApplications. Bing had 3.17%, behind China’s Baidu (8.87%) and Yahoo (7.15%).

Not only does Microsoft have a lot of ground to cover before it draws even with Google, but it also faces a competitor that isn’t standing still.

Google last week unveiled a developer preview of its new Web search architecture called “Caffeine.” The search leader clearly has no intention of letting Bing’s gain go unchallenged.

Google Gmail Passes AOL, Becoming Third Most Popular E-mail (Information Week)

By Thomas Claburn
InformationWeek
InformationWeek

With its growth rate climbing, Gmail is on track to pass Microsoft’s Hotmail in the first quarter of 2010.

Google’s Gmail has surpassed AOL as the third most visited e-mail service in the U.S. and is poised to pass Windows Live Hotmail in about seven months.

Between July 2008 and July 2009, Gmail’s number of unique monthly visitors in the U.S. increased from 25.3 million to 36.9 million, according to ComScore.

Gmail’s rate of traffic growth has been increasing, too. In the July 2008 to July 2009 period, Gmail grew at a rate of 46%, up from 39% during the period between September 2007 and September 2008.

From July 2008 to July 2009, AOL’s monthly visitor total declined by 19%, from 45.1 million to 36.4 million. Windows Live Hotmail, which lost 4% of its visitors between September 2007 and September 2008, managed to eke out a 3% gain during the July 2008 to July 2009 period.

But with 47.1 million monthly visitors, Windows Live Hotmail is more or less where it was in September 2007, when its monthly visitor count stood at 46.2 million.

If current trends continue, Gmail should surpass Hotmail by the end of February next year and take second place in visitor traffic behind Yahoo Mail. The release of Windows 7, however, may contribute to renewed interest in Microsoft services like Hotmail and may delay Gmail’s move to second place.
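That crossover estimate is straightforward compound growth. A back-of-the-envelope version – my own extrapolation, assuming each service simply keeps its most recent annual growth rate, which is not how ComScore models traffic – lands in the same early-2010 window:

    import math

    gmail, hotmail = 36.9e6, 47.1e6            # unique U.S. visitors, July 2009
    gmail_growth, hotmail_growth = 0.46, 0.03  # year-over-year growth rates

    # Convert annual growth to equivalent monthly multipliers.
    g = (1 + gmail_growth) ** (1 / 12)
    h = (1 + hotmail_growth) ** (1 / 12)

    # Solve gmail * g**t = hotmail * h**t for t, in months.
    t = math.log(hotmail / gmail) / math.log(g / h)
    print(f"crossover in roughly {t:.0f} months")   # about 8 months out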

Yahoo Mail, the leading free e-mail service, has been doing better lately. Its visitor traffic, 106.1 million last month by ComScore’s count, grew at a rate of about 11% in 2008 and at a rate of 22% between July 2008 and July 2009.

There are, of course, other metrics by which one can measure the popularity of e-mail services, such as the number of registered accounts. Online traffic, however, correlates with active usage.

Gmail’s torrid growth coincides with a period of aggressive innovation. Google has delivered new Gmail features and capabilities every week, more or less, since the opening of Gmail Labs in June last year.

Google has also been encouraging businesses to start using Google Apps, which includes Gmail as well as online applications like Google Docs, Google Calendar, Google Sites, and Google Video.

Google did not immediately respond to a request to confirm ComScore’s figures.

GM Claims Chevy Volt Will Get 230 MPG–But How? (Popular Science)

General Motors CEO Fritz Henderson says the EPA will certify the Chevrolet Volt with triple-digit mileage. How’d they come up with that?

The 2011 Chevrolet Volt General Motors

[Update: The EPA issued a statement to the folks at Edmunds stepping back from GM's mileage claim: "The EPA has not tested a Chevy Volt and therefore cannot confirm the fuel economy values claimed by GM. EPA does applaud GM's commitment to designing and building the car of the future - an American-made car that will save families money, significantly reduce our dependence on foreign oil and create good-paying American jobs."]

General Motors calls the Chevrolet Volt an extended-range electric vehicle. That’s because the only motive force comes from the electric motor; the gas engine only charges the batteries. In a press conference earlier today, GM’s CEO Fritz Henderson said the Volt will have a city mileage figure of 230 miles per gallon–almost five times more efficient than a Prius. But considering the uniqueness of the Volt’s powertrain, how did the EPA get that figure?

Call it a “draft methodology.” That’s a quick way of saying the EPA is developing a few assumptions to populate a new “duty cycle” for the Volt. The duty cycle is the usage profile the agency uses when determining the city and highway mileage numbers to put on a new car’s window sticker. The latest EPA cycle, set in 2006, accounts for actual driving conditions, such as high speed, aggressive driving, use of air conditioning, and cold temperature operation.

Of course, the Volt’s fuel-consumption parameters are a bit more complex. Motor Trend reported a while back that such complexity had put GM and the EPA at odds over how to calculate the Volt’s mileage. Judging by today’s statement from CEO Henderson (and all those “230” ads you’ve been seeing without realizing it), GM and the EPA have apparently come to terms.

As John Voelcker from GreenCarReports.com points out, GM says the Volt can travel for the first 40 miles on battery power alone. That means, if you never drive more than 40 miles a day, your mileage is technically “infinity.” Of course, that isn’t quite accurate over longer distances. So the EPA likely adopted a test cycle that involves driving the Volt until the battery is discharged, and then for a further distance using gasoline power.

GM-Volt.com reports on a similar test routine proposed by Mike Duoba at Argonne National Laboratory, in which the Volt is driven repeatedly through four EPA highway test cycles until the battery is discharged, then through one city cycle, totaling 51 miles. The EPA city cycle is just under 11 miles, the highway cycle about 10.26 miles. If you do the math, as Voelcker has, it works out to 232 mpg. Sounds familiar.
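The arithmetic is simple once one assumption is made: how far a gallon takes the Volt after the battery is spent. The roughly 50 mpg gas-only figure below is my guess, chosen because it reproduces the reported result; the EPA has published no methodology:

    total_miles = 51.0      # four highway cycles plus one city cycle (per the article)
    city_cycle = 11.0       # "just under 11 miles" - the only leg burning gasoline
    gas_only_mpg = 50.0     # assumed charge-sustaining fuel economy (my guess)

    gallons_used = city_cycle / gas_only_mpg        # 0.22 gallons for the whole test
    print(f"{total_miles / gallons_used:.0f} mpg")  # prints 232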

We’ll be watching to see when the EPA gives up the goods.

Mobile data show friend networks (BBC)

Representation of mobile data survey (Stephen Guerin, Redfish Group)

Movement and call data showed a different picture of connectivity than surveys

Friendships can be inferred with 95% accuracy from call records and the proximity of users, says a new report.

Researchers fitted 94 mobiles in the US with logging software to gather data.

The results also showed that those with friends near work were happier, while those who called friends while at work were less satisfied.

The data, published in Proceedings of the National Academy of Sciences, showed a marked contrast with answers reported by the users themselves.

“We gave out a set of phones that were installed with a piece of ‘uber-spyware’,” said the study’s lead author Nathan Eagle, now at the Santa Fe Institute.

“It’s invisible to the user but logs everything: communication, users’ locations, people’s proximity by doing continuous Bluetooth scans.”

The researchers then compared the data with results from standard surveys given to the mobile users – and found, as the social sciences have found time and again, that people reported different behaviour than the mobile data revealed.

“What we found was that people’s responses were wildly inaccurate,” Dr Eagle told BBC News.

“For people who said that a given individual was a friend, they dramatically overestimated the amount of time they spent. But for people who were not friends, they dramatically underestimated that amount of time.”

The researchers were able to guess from the mobile data alone, with 95% accuracy, if any given pair of users were friends.
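The published analysis fits a statistical model to behavioural features of each pair of users; the toy scorer below only illustrates the flavour of the idea, with weights and thresholds invented for the example (the study found calls and proximity outside work hours especially telling):

    from dataclasses import dataclass

    @dataclass
    class PairFeatures:
        calls_per_month: float
        proximity_hours_work: float     # co-location during working hours
        proximity_hours_offsite: float  # co-location on evenings and weekends

    def looks_like_friends(pair):
        """Hand-tuned rule of thumb: off-site proximity is the strong signal."""
        score = (0.5 * pair.calls_per_month
                 + 2.0 * pair.proximity_hours_offsite
                 + 0.1 * pair.proximity_hours_work)
        return score > 10.0

    colleagues = PairFeatures(calls_per_month=2,
                              proximity_hours_work=30, proximity_hours_offsite=0)
    friends = PairFeatures(calls_per_month=6,
                           proximity_hours_work=5, proximity_hours_offsite=8)
    print(looks_like_friends(colleagues), looks_like_friends(friends))  # False True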

The overall proximity of a given user to his or her friends – maximised if they worked together – correlated with reporting a high level of satisfaction at work.

Conversely, those who made calls to their friends while working were found to report lower levels of satisfaction at work.

Wide application

One principal question raised by such a small sample, made up exclusively of students from the Massachusetts Institute of Technology, is how much the results really mean in a sociological context.

However, the group has gone on to carry out a larger, recently completed study comprising 1,000 people in Helsinki, Finland.

There is also an ongoing trial of the approach in Kenya, which Dr Eagle said includes participants ranging from computer science students to people who had never used a phone before.

Nokia 6600 (Nokia)

Standard Nokia 6600 handsets were fitted with “uber-spyware”

Dr Eagle sees the approach not as a means to supplant but rather to supplement traditional measures.

“Mobile phone data are fantastic complements to the existing, very deep survey literature that the social sciences already have,” he said.

Moreover, he sees it not just as a means to map out the networks of friends that mobile users might have, but to carry on this “reality mining” in contexts ranging from the modelling of the spread of disease to the design of urban spaces.

“We were capturing data when the Red Sox won the [baseball championship] World Series for the first time,” Dr Eagle recounted.

“Suddenly all our subjects became unpredictable; they all flooded into downtown Boston to a rally in the centre of the city.

“City planners approached us because they wanted to know how people were using urban infrastructure, to know when the people left the rally, how many walked across the bridge and how many took the subway, how many biked or took the bus.

“We can give them some real insight with the idea of helping them build a better city that reflects people’s actual behaviour.”

Alarm sounded over game futures (BBC)

By Daniel Emery
BBC technology reporter, Edinburgh

Screenshot from Tiger Woods PGA Tour Online, EA

EA is experimenting with novel ways for players to pay for games

A stark warning about the finances of the games industry has been aired at the Edinburgh Interactive conference.

The sector had suffered “significant disruption” to its business model, Edward Williams, from BMO Capital Markets, told the industry gathering.

“For Western publishers, profitability hasn’t grown at all in the past few years and that’s before we take 2009 into account,” he said.

By contrast, he said, Chinese firms were still seeing improved profits.

What made the difference between Western firms and Chinese developers was the way they went about getting products to players.

Western publishers, said Mr Williams, still relied on the traditional development model of putting a game on a DVD and then selling it through retail channels.

Chinese developers focussed primarily on the PC market and used direct download, rather than retail stores, to get games to consumers.

Those Chinese developers were also helped by the low number of console users in South East Asia, which meant developers there did not have to pay royalties to console makers.

Future models

Three factors, said Mr Williams, were forcing the operating costs of Western firms to spiral upwards:

• Games are getting larger, which meant longer development time and larger staff costs.

• After its release in the 1990s the PlayStation accounted for 80% of the market. Today the console space is very fragmented, so developers have to work on many platforms at any one time.

• The cost of licensing intellectual property or gaining official sports body endorsement (such as FIFA or FIA) has gone up.

These factors, said Mr Williams, explained the stagnation in overall profitability despite sales in the games sector increasing by $30bn (£24.17bn) over the past four years.

Recent figures suggest sales are also coming under pressure: US game sales fell by 29% in the past 12 months, according to statistics from research group NPD.

PS2 console, AP

The PlayStation no longer dominates, pushing up costs for game makers

Speaking to the BBC, Peter Moore – president of EA Sports – said that while the Chinese and Western markets were still very different, he expected to see some significant changes in the way Westerners buy games in the future.

“In China, PC and mobile platforms will continue to dominate,” he said. “There isn’t the necessity to buy other pieces of hardware and it is our job to service that.”

“In Europe we are going to see more content that’s delivered electronically, be that through Steam, Xbox Live or whatever.”

Mr Moore added that while this may have some impact on retailers, the future of the high street shop was still bright, especially if you factor in sales of hardware, peripherals and game-time cards.

“The release of Tiger Woods online as a free to play experience will be the real test of the Western consumer’s appetite for digital downloading,” he said.

The game, scheduled for release in late 2009, has a segment that gamers can play for free online, with additional content available to buy as required.

Now in its sixth year, the Edinburgh Interactive Conference brings together industry figures, developers, publishers and the media to discuss issues facing the interactive game sector and to try to promote creativity.

Microsoft Word Ban Sparked By 4-Page Complaint (Information Week)

By InformationWeek
August 13, 2009 07:27 AM

Unlike the lengthy tomes that often kick-start major lawsuits, the formal legal complaint that ultimately led a judge to impose a ban on U.S. sales of Microsoft (NSDQ: MSFT) Word – effective in 60 days – runs just four pages.

At its heart is an allegation that Word and other Microsoft technologies, including Windows Vista and .NET Framework, violate an obscure patent that governs how computer programs manipulate certain information within a document.

Judge Leonard Davis, of U.S. District Court for Eastern Texas, on Tuesday ruled that Word – but not Vista and .NET – does indeed step on U.S. patent 5,787,449, which describes a “Method and System for Manipulating the Architecture and the Content of a Document Separately from Each Other,” according to court records.

The patent is held by an equally obscure tech firm based in Toronto – i4i, Inc. The company describes itself as a developer of “collaborative content solutions.”

In its complaint, originally filed March 6, 2007, i4i claimed Microsoft infringed its patent “by making, using, selling, offering to sell, and/or importing in or into the United States, without authority, Word 2003, Word 2007, .NET Framework, and Windows Vista.”

Davis in his ruling said Microsoft Word had indeed “unlawfully infringed” on i4i’s patent. With that, he enjoined Redmond from selling or supporting new copies of Word 2003 and Word 2007 in the U.S. The ban would take effect in mid-October. Davis also ordered Microsoft to pay i4i more than $240 million in damages and costs.

Microsoft, not surprisingly, said it strongly disagrees with Davis’ ruling and plans to appeal the order.

Davis left an out for Microsoft. He noted that the infringing aspect of Word is its ability to open and read documents that contain custom XML – a form of the Extensible Markup Language format that businesses create to forge links between their back-office data and PC applications like Word.

Davis said any version of Word that opens documents in plain text only, or which strips a document of custom XML through a process known as a transform, would be free from his order. That leaves the door open for Microsoft to issue a patch that alters Word’s functionality in such a way as to circumvent the ban.
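Conceptually, such a transform walks the document markup and keeps only Word’s own elements, unwrapping anything in a custom namespace. The sketch below is an illustration of the idea only – not Microsoft’s actual fix – and it skips the text-node bookkeeping a production transform would need:

    import xml.etree.ElementTree as ET

    WORDML = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

    def strip_custom_xml(xml_text):
        """Remove custom-namespace elements, keeping their WordprocessingML
        descendants. (Text and tail handling is elided for brevity.)"""
        root = ET.fromstring(xml_text)

        def prune(elem):
            kept = []
            for child in list(elem):
                prune(child)                  # clean the subtree first
                if child.tag.startswith(WORDML):
                    kept.append(child)        # standard Word markup: keep it
                else:
                    kept.extend(list(child))  # custom tag: promote its contents
            elem[:] = kept

        prune(root)
        return ET.tostring(root, encoding="unicode")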

The company took a similar tack with Vista after European regulators found that the bundling of the operating system with Windows Media Player violated competition rules. Microsoft in response created a version of Vista for sale on the Continent that does not include WMP.

A third option for Microsoft is to settle the case with i4i by purchasing rights to its technology.

What’s not in doubt is that the stakes are high for Redmond. Microsoft Office, which includes Word, accounted for more than $3 billion in worldwide sales in Microsoft’s most recent fiscal year. So any prolonged ban on Microsoft Word sales could severely impact the company’s top line.
