Posts Tagged technology

EBay to sell Skype stake for $1.9 billion (Reuters)

Tue Sep 1, 2009 10:39am EDT



NEW YORK (Reuters) – EBay Inc plans to sell a 65 percent stake in its online phone unit Skype for $1.9 billion to private investors including Silver Lake and a venture firm run by Netscape co-founder Marc Andreessen.

Shares in eBay rose 40 cents, or 1.8 percent, to $22.54 on Nasdaq after the news.

The deal values Skype at $2.75 billion, according to the Internet auction house, which had bought the phone company in 2005 for about $3.1 billion.

The group buying Skype also includes London-based Index Ventures and the Canada Pension Plan, in addition to Silver Lake and Andreessen’s firm Andreessen Horowitz.

The deal lets eBay focus on its PayPal electronic payments service as well as its flagship auction service, the company said.

EBay originally planned to spin off Skype next year. John Donahoe, eBay’s chief executive, said in May that a $2 billion valuation would be low for the growing Internet telephone business.

In 2007, eBay wrote down about $1.4 billion of its investment in Skype, conceding it did not fit in with the rest of its online auction business.

“Skype is a strong standalone business, but it does not have synergies with our e-commerce and online payments business,” Donahoe said in a statement.

EBay expects the deal to close in the fourth quarter. The transaction is not subject to a financing condition.

(Reporting by Sinead Carew in New York and Ajay Kamalakaran in Bangalore; editing by Will Waterman and Derek Caney)


US cyber-security ‘embarrassing’ (BBC)

By Maggie Shiels
Technology reporter, BBC News, Silicon Valley

[Image: Experts say the threat is increasing fast]

America’s cyber-security has been described as “broken” by one industry expert and as “childlike” by another.

The criticism comes as President Obama prepares to release the results of a review he had ordered.

Tim Mather, chief strategist for security firm RSA, told BBC News: “The approach we have relied on for years has effectively run out of steam.”

Alan Paller from security research firm SANS Institute said the government’s cyber defences were “embarrassing”.

The government review, which will outline a way forward, is expected to be opened up for public comment at the end of this month.

At the same time, President Obama is also expected to announce the appointment of a cyber-security tsar as part of the administration’s commitment to make the issue a priority.

For many attending last week’s RSA Conference in San Francisco, the biggest security event of its kind, such focus is welcome.

“I think we are seeing a real breaking point in security with consumers, business and even government saying enough, no more. Let’s rethink how we do this because the system is broken,” said Mr Mather.

‘Laws of procurement’

Over the past couple of weeks, the heat has been turned up on the issue of cyber-security following some high profile breaches.

One involved the country’s power grid, which was said to have been infiltrated by nation states. The government subsequently admitted that it was “vulnerable to attack”.

[Image: The review will provide a roadmap for tackling cyber-security]

Meanwhile, reports surfaced during the RSA Conference that spies had hacked into the Joint Strike Fighter project.

The topic is very much on the radar of politicians, who have introduced a number of bills to address security in the virtual world.

One includes a provision to allow the president to disconnect government and private entities from the internet for national security reasons in an emergency.

The latest bill, introduced this week by Senator Tom Carper, has called for the creation of a chief information officer to monitor, detect and respond to threats.

Mr Paller, who is the director of research for SANS, believes the government’s multi-billion dollar budget is the most effective weapon it has to force change.

“The idea of cyber-security leadership isn’t if it’s the White House or DHS (Dept of Homeland Security). It’s whether you use the $70bn you spend per year to make the nation safer.”

He said the best way to ensure that was to require industry to provide more secure technology for federal acquisitions.

“If you want to change things, use the laws of procurement,” suggested Mr Paller.

Hot seat

There is a growing view that the industry is also at a crossroads and has a responsibility to alter the way it operates.

[Image: There are 32,000 suspected cyber-attacks every 24 hours]

“I think we are more aware of security than ever before,” said Benjamin Jun, vice-president of technology at Cryptography Research.

“We are looking at risk in a new way and the good security practitioners are in the hot seat. It’s time for them to do their job.”

It is also time for them to come up with new technologies that can keep pace with, and move ahead of, the threats that affect the whole of cyberspace, says Asheem Chandna of venture firm Greylock Partners.

“For the evolution of the internet, I think we need the next wave of innovation. The industry clearly needs to step up and deliver the next set of technologies to protect people and stay ahead of the bad guys.”

He also believes the smaller innovative companies in Silicon Valley could help the government be more productive if they were not effectively locked out of the process by the big established firms.

“We want smaller companies that are innovating in Silicon Valley to be given a better chance to help government agencies meet their mandate but the bureaucracy to do this hinders these companies.

“Instead they go to commercial customers because they see the value, they move fast, they see the return on investment and the competitive advantage it can give them. The federal government is more of a laggard in this area,” said Mr Chandna.

‘Silver lining’

There is undoubtedly a consensus that the security of the internet needs to be improved and that attacks are taking their toll on everything from banks to credit card companies and from critical infrastructure to defence.

[Image: The president has likened the threat to the internet to that of a nuclear attack]

“There is a silver lining to this dark cloud,” said Mark Cohn, the vice-president of enterprise security at security firm Unisys.

“Public awareness, and that among the community and interested parties, has grown tremendously over the last year or two.

“Cyber-security affects us all from national security to the mundane level of identity theft and fraud. But that means society as a whole is more receptive to many of the things we need to do that would in the past have been seen as politically motivated.”

For security firm VeriSign, a shift in how people practise security is what is needed.

“Security is a state of mind,” said the company’s chief technology officer, Ken Silva.

“Up until now we have relied on the inefficient system of user names and passwords for security. Those have been obsolete for some time now and that is why our research is focused on making authentication stronger and more user-friendly.”

To that end, VeriSign has introduced a security application that produces an ever-changing password credential for secure transactions on the iPhone or BlackBerry. To date the free app has been downloaded more than 20,000 times.
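The article does not detail the app’s internals, but “ever-changing” password credentials of this kind are typically time-based one-time passwords. The sketch below, in Python, illustrates the general RFC 6238-style technique under that assumption; it is not VeriSign’s actual implementation, and the shared secret is hypothetical.

```python
# Minimal sketch of a time-based one-time password (TOTP) generator, the
# general technique behind "ever-changing" password credentials. This is
# an illustration of the RFC 6238 approach, NOT VeriSign's implementation;
# the shared secret below is hypothetical.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Derive a short-lived numeric code from a shared secret."""
    counter = int(time.time()) // interval      # steps every `interval` seconds
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp(b"hypothetical-shared-secret"))     # e.g. "492039"; changes every 30s
```

Because client and server derive the same short-lived code from the clock and a shared secret, a stolen code expires within seconds, which is what makes such credentials stronger than a static password.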

“It’s one thing to say security is broken, but the consumer doesn’t care until it affects them,” said Mr Silva.

“But if we as an industry want them to use stronger security measures we have to make it easy and more user friendly.”

Indeed, Mr Cohn believes everybody has to play his or her part as the online world becomes increasingly integral to our lives.

“It may seem like we are under attack and the world is more dangerous but in some ways the threat environment is shifting.

“Now the greater concern for people is protecting their information, their identity, their financial security as we move to put more information online like our health records and our social security records.

“We are at a crossroads and this should be viewed as a healthy thing,” said Mr Cohn.


Insider risk problem revealed (BBC)

By Maggie Shiels
Technology reporter, BBC News, Silicon Valley

[Image: The headlines get the real cyber-security threat wrong, says the RSA]

Security experts have turned on its head the notion that so-called “malicious insiders” are the biggest cyber-security threat for companies.

The security vendor RSA revealed that the majority of breaches are actually caused unintentionally by employees.

Its survey showed that firms believed 52% of incidents were accidental and 19% were deliberate.

“Unintentional risk gets overlooked, yet it’s the most serious threat to business,” said the RSA’s Chris Young.

“The sexy incident where someone gets arrested for stealing records and selling them to a third party for a lot of money is the stuff that catches the attention of the media, the regulators, executives and Congress people.

“But this is not necessarily where organisations have 100% of the risk,” said Mr Young, the RSA’s senior vice president of products.

The study conducted by the RSA and IT analysts IDC looked at 11 different categories of risk ranging from malware and spyware to employees having excessive access to systems and from unintentional data loss to malicious acts for personal gain.

The report concluded that the difference between the most frequent type of cyber breach – unintentional data loss, at 14.4% per year, and the bottom of the list – internal fraud, at 10.6% – is a clear sign that no single solution can address all potential internal security risks.

It covered over 400 firms from the US, UK, France and Germany across a variety of sectors including the financial industry, healthcare, telecommunications and technology.

‘Weakest link’

The report noted that whether the threats are accidental or deliberate, the cost to a company of a cyber breach is still the same.

The RSA and IDC said disclosure of sensitive information results in regulatory actions, failed audits, litigation, public ridicule and competitive fallout.

[Image: Government figures report 32,000 suspected cyber-attacks every day]

“The figures are hard to quantify, but the average annual financial loss to insider risk adds up to $800,000 (£480,000) overall per organisation in the US and between $300,000-$550,000 (£180,000-£330,000) in the UK, France and Germany.

“And that ties into the billions of dollars range when you think of the thousands of companies that comprise the IT industry,” said Mr Young.

A recent report by the Ponemon Institute found that the average cost of a data breach in 2008 was $202 (£122) per customer record.

The information security firm also determined that the expense continued to rise by 38% between 2004 and 2008.

The RSA and IDC discovered that the weakest link in any company is the temporary employee or contractor.

“They represent the greatest internal risk,” Mr Young told BBC News.

“Most organisations start with a principle of trust and you trust your employees to be able to do their job well and protect the interests of the company. There are always levels of trust, which are greater or lesser depending on how closely tied an individual is to an organisation.

“It’s likely contractors may be less well-trained in organisational policy, and it’s harder to maintain control over their access to systems because of the limited time they spend with an organisation. There is always a tension between letting an employee do his or her job versus security,” said Mr Young.

The Better Business Bureau has drawn up a list of simple things companies should do to secure their data, often regarded as the crown jewels of any business.

It advises limiting systems access to a few trusted employees, using a password protection system for logging in, equipping computers with firewalls and virus protection and educating employees.


Code-cracking and computers (BBC)

By Mark Ward
Technology correspondent, BBC News

[Image: By the end of WWII, 11 Colossus machines were in use]

Bletchley Park is best known for the work done on cracking the German codes and helping to bring World War II to a close far sooner than might have happened without those code breakers.

But many believe Bletchley should be celebrated not just for what it ended but also for what it started – namely the computer age.

The pioneering machines at Bletchley were created to help codebreakers cope with the enormous volume of enciphered material the Allies managed to intercept.

The machine that arguably had the greatest influence in those early days of computing was Colossus – a rebuilt version of which now resides in the National Museum of Computing, also on the Bletchley site.

Men and machine

The Enigma machines were used by the field units of the German Army, Navy and Air Force. But the communications between Hitler and his generals were protected by different machines: the Lorenz SZ40 and SZ42.

The German High Command used the Lorenz machine because it was so much faster than the Enigma, making it much easier to send large amounts of text.

“For about 500 words Enigma was reasonable but for a whole report it was hopeless,” said Jack Copeland, professor of philosophy at the University of Canterbury in New Zealand, director of the Turing Archive and a man with a passionate interest in the Bletchley Park computers.

[Image: Bletchley employed thousands of code breakers during wartime]

The Allies first picked up the stream of enciphered traffic, dubbed Tunny, in 1940. The importance of the material it contained soon became apparent.

Like Enigma, the Lorenz machines enciphered text by mixing it with characters generated by a series of pinwheels.
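In modern terms, that mixing was an exclusive-OR: each 5-bit teleprinter character was combined with a key character generated by the wheels, and because XOR is its own inverse the same operation both enciphers and deciphers. Here is a toy Python sketch of the principle; the wheel patterns are invented for illustration, and the real SZ40/42 used twelve wheels with far more intricate stepping.

```python
# Toy sketch of the Lorenz principle: XOR each 5-bit teleprinter character
# with a key character generated by rotating "pinwheels". The wheel patterns
# below are made up for illustration; the real SZ40/42 used twelve wheels
# with far more complex, partly irregular stepping.

WHEELS = [                      # one hypothetical wheel per bit of the 5-bit code
    [1, 0, 1, 1, 0, 0, 1],
    [0, 1, 1, 0, 1],
    [1, 1, 0, 0, 1, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 1],
]

def keystream(n):
    """Yield n 5-bit key characters as the wheels step once per character."""
    for step in range(n):
        bits = [wheel[step % len(wheel)] for wheel in WHEELS]
        yield sum(bit << i for i, bit in enumerate(bits))

def crypt(chars):
    """Encipher or decipher: XOR with the keystream is its own inverse."""
    return [c ^ k for c, k in zip(chars, keystream(len(chars)))]

plain = [0b10100, 0b00011, 0b01110]   # three 5-bit code units
cipher = crypt(plain)
assert crypt(cipher) == plain          # the same operation recovers the plaintext
```

Bill Tutte’s feat, described below, was deducing the lengths and patterns of the real wheels purely from intercepted ciphertext.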

“We broke wheel patterns for a whole year before Colossus came in,” said Captain Jerry Roberts, one of the codebreakers who deciphered Tunny traffic at Bletchley.

“Because of the rapid expansion in the use of Tunny, our efforts were no longer enough and we had to have the machines in to do a better job.”

The man who made Colossus was Post Office engineer Tommy Flowers, who had instantly impressed Alan Turing when asked by the maverick mathematician to design a machine to help him in his war work.

But, said Capt Roberts, Flowers could not have built his machine without the astonishing work of Cambridge mathematician Bill Tutte.

“I remember seeing him staring into the middle distance and twiddling his pencil and I wondered if he was earning his corn,” said Capt Roberts.

But it soon became apparent that he was.

“He figured out how the Lorenz machine worked without ever having seen one and he worked out the algorithm that broke the traffic on a day-to-day basis,” said Capt Roberts.

“If there had not been Bill Tutte, there would not have been any need for Tommy Flowers,” he said. “The computer would have happened later. Much later.”

Valve trouble

Prof Copeland said Tommy Flowers faced scepticism from Bletchley Park staff and others that his idea for a high-speed computer employing thousands of valves would ever work.

[Image: Colossus kept valves lit to ensure they kept on working]

“Flowers was very much swimming against the current as valves were only being used in small units,” he said. “But the idea of using large numbers of valves reliably was Tommy Flowers’ big thing. He’d experimented and knew how to control the parameters.”

And work it did.

The close co-operation between the human translators and the machines meant that the Allies got a close look at the intimate thoughts of the German High Command.

Information gleaned from Tunny was passed to the Russians and was instrumental in helping them defeat the Germans at Kursk – widely seen as one of the turning points of WWII.

The greater legacy is the influence of Colossus on the origins of the computer age.

“Tommy Flowers was the key figure for everything that happened subsequently in British computers,” said Prof Copeland.

After the war Bletchley veterans Alan Turing and Max Newman separately did more work on computers using the basic designs and plans seen in Colossus.

Turing worked on the Automatic Computing Engine for the British government and Newman helped to bring to life the Manchester Small Scale Experimental Machine – widely acknowledged as the first stored-program computer.

The work that went into Colossus also shaped the thinking of Maurice Wilkes, Freddie Williams, Tom Kilburn and many others – essentially the whole cast of characters from whom early British computing arose.

And the rest, as they say, is history.


Spielberg hails Xbox controller (BBC)


Autonomous tech ‘requires debate’ (BBC)

By Jason Palmer
Science and technology reporter, BBC News

[Image: Fully autonomous rapid transit systems already exist at Heathrow Airport]

The coming age of lorries that drive themselves or robots that perform surgery is fraught with legal and ethical issues, says a new report.

The Royal Academy of Engineering says that automated freight transport could be on the roads in as few as 10 years.

Also, it says, robotic surgery will begin to need less human intervention.

But it suggests that much debate is needed to address the ethical and legal issues raised by putting responsibility in the hands of machines.

“We’re all used to automatic systems – lifts, washing machines. We’re talking about levels above that,” said Lambert Dopping-Heppenstal of the Academy’s engineering ethics working group.

“It’s about systems that have some level of self-determination.”

Coming era

Issues surrounding autonomous systems and robots with such self-determination have been discussed for a number of years, particularly with regard to the autonomous machines of warfare.

However, the era of autonomous road vehicles and surgeons is slowly becoming reality, making the issues more urgent, the report says.

The removal of direct control from a car’s driver is already happening, with anti-lock braking systems and even automatic parking systems becoming commonplace.

But the next step is moving toward completely driverless road vehicles, which already exist in a number of contexts, including London’s Heathrow Airport.

[Image: The time may come when robotic surgeons operate without human help]

The Darpa Grand Challenge, a contest sponsored by the US defence department’s research arm, has driverless cars negotiating traffic and obstacles and obeying traffic rules over courses nearly 100km long.

“Those machines would have passed the California driving test, more than I would have,” said Professor Will Stewart, a fellow of the Academy.

“Autonomous vehicles will be safer. One of the compelling arguments for them is that the machine cannot have an argument with its wife; it can run 24 hours a day without getting tired. But it is making decisions on its own.”

Professor Stewart and report co-author Chris Elliott remain convinced that autonomous systems will prove, on average, to be better surgeons and better lorry drivers than humans are.

But when they are not, it could lead to a legal morass, they said.

“If a robot surgeon is actually better than a human one, most times you’re going to be better off with a robot surgeon,” Dr Elliott said. “But occasionally it might do something that a human being would never be so stupid as to do.”

Professor Stewart concluded: “It is fundamentally a big issue that we think the public ought to think through before we start trying to imprison a truck.”


40 years of Unix (BBC)

By Mark Ward
Technology Correspondent, BBC News

[Image: Unix had computer networking built in from the start]

The computer world is notorious for its obsession with what is new – largely thanks to the relentless engine of Moore’s Law that endlessly presents programmers with more powerful machines.

Given such permanent change, anything that survives for more than one generation of processors deserves a nod.

Think, then, what the Unix operating system deserves, because in August 2009 it celebrates its 40th anniversary. It has been in use throughout those four decades and today is getting more attention than ever before.

Work on Unix began at Bell Labs after AT&T (which owned the lab), MIT and GE pulled the plug on an ambitious project to create an operating system called Multics.

The idea was to make better use of the resources of mainframe computers and have them serve many people at the same time.

“With Multics they tried to have a much more versatile and flexible operating system, and it failed miserably,” said Dr Peter Salus, author of the definitive history of Unix’s early years.

Time well spent

The cancellation meant that two of the researchers assigned to the project, Ken Thompson and Dennis Ritchie, had a lot of time on their hands. Frustrated by the size and complexity of Multics but not its aims of making computers more flexible and interactive, they decided to try and finish the work – albeit on a much smaller scale.

The commitment was helped by the fact that in August 1969, Ken Thompson’s wife took their new baby to see relatives on the West Coast. She was due to be gone for a month and Thompson decided to use his time constructively – by writing the core of what became Unix.

He allocated one week each to the four core components: operating system, shell, editor and assembler. It was during that time, and afterwards as the growing team got the operating system running on a DEC computer known as the PDP-7, that Unix came into being.


By the early 1970s, five people were working on Unix. Thompson and Ritchie had been joined by Brian Kernighan, Doug McIlroy and Joe Ossanna.

The name was reportedly coined by Brian Kernighan – a lover of puns who wanted Unics to stand in contrast to its forebear Multics.

The team got Unix running well on the PDP-7 and soon it had a long list of commands it could carry out. The syntax of many of those commands, such as chdir and cat, is still in use 40 years on. Along with it came the C programming language.

But, said Dr Salus, it wasn’t just the programming that was important about Unix – the philosophy behind it was vital too.

“Unix was created to solve a few problems,” said Dr Salus, “the most important of which was to have something that was much more compact than the operating systems that were current at that time which ran on the dinosaurs of the computer age.”

Net benefits

Back in the early 1970s, computers were still huge and typically overseen by men in white coats who jealously guarded access to the machines. The idea of users directly interacting with the machine was downright revolutionary.

“It got us away from the total control that businesses like IBM and DEC had over us,” said Dr Salus.

Word about Unix spread and people liked what they heard.

“Once it had jumped out of the lab and out of AT&T it caught fire among the academic community,” Dr Salus told the BBC. What helped this grassroots movement was AT&T’s willingness to give the software away for free.

[Image: DEC’s early computers were for many years restricted to laboratories]

That it ran on cheap hardware and was easy to move to different machines helped too.

“The fact that its code was adaptable to other types of machinery, in large and small versions meant that it could become an operating system that did more than just run on your proprietary machine,” said Dr Salus.

In May 1975 it got another boost by becoming the chosen operating system for the internet. The decision to back it is laid out in document RFC 681, which notes that Unix “presents several interesting capabilities” for those looking to use it on the net.

It didn’t stop there. Unix was adapted for use on any and every computer from mainframes to desktops. While it is true that it did languish in the 1980s and 90s as corporations scrapped over whose version was definitive, the rise of the web has given it new life.

The wars are over and the Unix specification is looked after by the Open Group – an industry body set up to police what is done in the operating system’s name.

Now Unix, in a variety of guises, is everywhere. Most of the net runs on Unix-based servers and the Unix philosophy heavily influenced the open source software movements and the creation of the Linux desktop OS. Windows runs the communication stack created for Unix. Apple’s OS X is broadly based on Unix and it is possible to dig into that software and find text remarkably similar to that first written by Dennis Ritchie in 1971.

“The really nice part is the flexibility and adaptability,” said Dr Salus, explaining why it is so widespread and how its ethic fits with a world at home with the web.

“Unix is the best screwdriver ever built,” said Dr Salus.


Bing Search Share Rises, Google And Yahoo Slip (Information Week)

Summer, usually a slow time for search, has given Microsoft something to smile about: The company’s Bing search engine gained market share.

By Thomas Claburn,  InformationWeek
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=219400514

Microsoft’s share of the U.S. search market grew slightly in July, while Google and Yahoo experienced slight declines.

Of the 13.6 billion U.S. searches conducted in July, 64.7% were conducted through Google sites, a 0.3 percentage point decline from June, according to ComScore.

Yahoo sites in July served 19.3% of those searches, also a 0.3 percentage point decline from the previous month.

Microsoft Bing’s search share increased by half of a percentage point in July. Its gain accounted for most of what Google and Yahoo lost. Microsoft sites served 8.9% of U.S. searches last month.

As a percentage change, Google’s search query total fell by 4%, Yahoo’s fell by 5%, and Microsoft’s increased by 2%.
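The two sets of numbers measure different things: share is a slice of that month’s total, while the query changes compare absolute volumes month on month. A quick back-of-the-envelope check in Python, treating the rounded figures above as exact, shows they are consistent only if the overall market shrank:

```python
# Back-of-the-envelope check: a 0.3-point share decline can coincide with a
# 4% drop in query volume if the whole market contracted. All inputs are the
# rounded figures quoted above, treated as exact.
july_total = 13.6e9            # total US searches in July
google_share_july = 0.647      # 64.7% share in July
google_drop = 0.04             # Google's query total fell 4% from June

google_july = google_share_july * july_total        # ~8.80 billion queries
google_june = google_july / (1 - google_drop)       # ~9.17 billion queries
june_share = google_share_july + 0.003              # share was 0.3 points higher
june_total = google_june / june_share               # implied June market size

shrinkage = 1 - july_total / june_total
print(f"Implied June total: {june_total / 1e9:.1f}bn searches")   # ~14.1bn
print(f"Implied market shrinkage: {shrinkage:.1%}")               # ~3.6%
```

An implied market contraction of roughly 3.5% squares with the observation above that summer is a slow season for search, and the Yahoo figures (19.3% share, down 0.3 points, queries down 5%) imply much the same June total.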

Ask and AOL accounted for 3.9% and 3.1% of the search market in July, respectively.

ComScore’s search share figures do not include searches related to mapping, local directory, and user-generated video sites.

While any gain is good news, Microsoft still has a long way to go. In February, prior to Bing’s launch, ComScore put Microsoft’s share of the U.S. search market at 8.5%.

In terms of worldwide search market share, Google processed 78.45% of all searches in July, according to NetApplications. Bing had 3.17%, behind China’s Baidu (8.87%) and Yahoo (7.15%).

Not only does Microsoft have a lot of ground to cover before it draws even with Google, but it also faces a competitor that isn’t standing still.

Google last week unveiled a developer preview of its new Web search architecture called “Caffeine.” The search leader clearly has no intention of letting Bing’s gain go unchallenged.


Web tool oversees Afghan election (BBC)

By Jonathan Fildes
Technology reporter, BBC News

[Image: Crowd-sourcing information on the election could ensure its fairness]

Any attempt to rig or interfere with Afghanistan’s election could be caught out by a system that allows anyone to record incidents via text message.

The Alive in Afghanistan project plots the SMS reports on an online map.

Citizens can report disturbances, defamation and vote tampering, or incidents where everything “went well”.

Their reports feature alongside those of full-time Afghan journalists to ensure the election process and reporting of it is as “free and fair” as possible.

“We hope to enable people to report on what is going on in the country,” explained Brian Conley, who helped set up the project.

“In the rural areas there are not going to be monitors, and it is questionable how much international media coverage there will be in these areas.”

Additional text and video reports will be added by a network of 80 reporters from the Afghan Pajhwok news agency, he said.


Mr Conley said that he hoped the results would be used by national and international media along with members of the international community.

In addition, he said, they may also be sent to the Electoral Commission if there are reports of tampering or rigging.

Content of crowds

The system relies on two established open-source technologies to gather the election reports.

The text messages are collected via a free platform known as FrontlineSMS, developed by UK programmer Ken Banks.

The system was originally developed for conservationists to keep in touch with communities in national parks in South Africa and allows users to send messages to a central hub.

It has previously been used to monitor elections in Nigeria, and has now been combined with a “crowd-sourced, crisis-mapping” tool known as Ushahidi, which plots the reports on a freely-accessible map.

The system was developed in Kenya when violence erupted following the disputed presidential elections between Mwai Kibaki and Raila Odinga.

Since then, the platform has also been used to document anti-immigrant violence in South Africa and problems in the Democratic Republic of Congo.

[Image: Thousands of duplicate voting cards were discovered in an investigation]

Together they allow reports to be gathered from any part of the country with mobile phone coverage.
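As a rough illustration of that pipeline (an SMS hub turning free-form text messages into categorised, mappable reports), here is a minimal Python sketch. The message format, category list and Report type are hypothetical; they are not the actual FrontlineSMS or Ushahidi schema.

```python
# Illustrative sketch of an SMS-to-map pipeline: parse incoming texts into
# categorised, geolocatable reports. The "CATEGORY location: message" format
# and the category list are hypothetical, not the real FrontlineSMS/Ushahidi
# schema.
from dataclasses import dataclass
from typing import Optional

CATEGORIES = {"DISTURBANCE", "TAMPERING", "DEFAMATION", "WELL"}

@dataclass
class Report:
    category: str
    location: str           # place name, to be geocoded before plotting
    message: str
    verified: bool = False  # unverified reports are marked as such

def parse_sms(text: str) -> Optional[Report]:
    """Parse 'CATEGORY location: free text' into a structured report."""
    try:
        head, message = text.split(":", 1)
        category, location = head.strip().split(" ", 1)
    except ValueError:
        return None   # malformed message: ignore rather than plot bad data
    if category.upper() not in CATEGORIES:
        return None
    return Report(category.upper(), location.strip(), message.strip())

print(parse_sms("WELL Kabul: polling station opened on time"))
```

Cross-checking each report and flagging the unverified ones, as the project describes below, would happen after this parsing step and before anything appears on the public map.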

Mr Conley hopes “hundreds of thousands of people” will use the system, which has been promoted by distributing “thousands of leaflets” and radio reports.

“I am confident that because of Pajhwok’s support we will see a good amount of content coming in,” he said.

However, he added, the project had to be “realistic about what is possible”.

“In a lot of parts of the country – for whatever reason – people don’t use SMS,” he said. “It is still a developing technology.”

In addition, he said, each text message is relatively expensive, costing the equivalent of two minutes of talk time.

“Even though that is the same amount of money it costs to buy bread for your family, people have told me that some will be willing not to eat that evening [in order to be able] to tell the international community what is going on in the country.”

‘Government pressure’

Any content that is sent to the service is cross-checked, he said, to ensure its authenticity.

Reports that are not verified will be marked as such.

In addition to the citizen reports, the map will be populated by reports from a network of Pajhwok journalists, he said.

The reporters would cover “every aspect of the election, good and bad,” he said.

The National Security Council of Afghanistan has asked all domestic and international media agencies to “refrain from broadcasting any incidence of violence during the election process”.

The Foreign Ministry has reportedly told Afghan media organisations that any domestic group defying the ban will be shut down.

“There is lots of pressure from the government not to cover these things,” said Mr Conley.


The problem with PowerPoint; celebrating 25 years (BBC)

If you have worked in an office in the Western world in the past 25 years, you will probably have sat through a PowerPoint presentation. But there’s a problem. They’re often boring, writes presentation expert Max Atkinson.

In the past 25 years, I’ve asked hundreds of people how many PowerPoint presentations they’ve seen that came across as really inspiring and enthusiastic.

Most struggle to come up with a single example, and the most optimistic answer I’ve heard was “two”.

So what are the main problems?

SCREENS ARE MAGNETS FOR EVERYONE’S EYES

Beware of anyone who says that they’re “just going to talk to some slides” – because that’s exactly what they’ll do – without realising that they’re spending most of their time with their backs to the audience.

[Image: Even Barack Obama needs an autocue on occasion]

Yet eye contact plays such a fundamental part in holding an audience’s attention that even as brilliant a speaker as Barack Obama depends on an autocue to simulate it.

So remember that the more slides you have and the more there is on each slide, the more distracting it will be for the audience – whereas the fewer and simpler the slides are, the easier it will be to keep them listening.

READING AND LISTENING DISTRACTS AUDIENCES

If there’s nothing but text on the screen, people will try to read and listen at the same time – and won’t succeed in doing either very well.

If the print is too small to read, they’ll get irritated at being expected to do the impossible. Nor does it help when speakers say “as you can see”, or the equally annoying “you probably won’t be able to read this”.

SLIDES SHOULDN’T JUST BE NOTES

Few speakers are willing to open their mouths until they have their first slide safely in place. But all too often the slides are verbal crutches for the speaker, not visual aids for the audience.

[Image: Some presentations prove somewhat less than gripping]

Projecting one slide after another might make it look as though you’ve prepared the presentation. But if you haven’t planned exactly what you’re going to say, you’ll have to ad lib and, if you start rambling, the audience will switch off.

To avoid this requires careful planning. Do this before thinking about slides and you won’t need as many of them – and the ones that you do decide to use are more likely to help to clarify things for the audience, rather than just remind you of what to say next.

INFORMATION OVERLOAD

You think bullet points make information more digestible? Think again. A dozen slides with five bullet points on each assumes that people are mentally capable of taking in a list of 60 points. If it’s a 30-minute presentation, that’s a rate of two per minute.

[Image: This looks a fairly interesting visual aid]

This highlights the biggest problem with slide-based presentations, which is that speakers mistakenly think that they can get far more information across than is actually possible in a presentation. At the heart of this is a widespread failure to appreciate that speaking and listening are fundamentally different from writing and reading.

In fact, the invention of writing was arguably the most important landmark in the history of information technology. Before writing, the amount of information that could be passed on to others was severely limited by what could be communicated in purely oral form (ie not much). But the ability to write meant that vast amounts of knowledge could be communicated at previously unimagined levels of detail.

The trouble is that PowerPoint makes it so easy to put detailed written and numerical information on slides that it leads presenters into the mistaken belief that all the detail will be successfully transmitted through the air into the brains of the audience.

THE BULLET POINT PROBLEM

A Microsoft executive recently said that one of the best PowerPoint presentations he’d ever heard had no slides with bullet points on them. This didn’t surprise me at all, because we’ve known for years that audiences don’t much like wordy slides and don’t find them as helpful as pictorial visual aids.

What does surprise me is that so many of the program’s standard templates invite users to produce lists of bullet points, when the program’s main benefits lie in the creation of images. If more presenters took advantage of that, inspiring PowerPoint presentations might become the norm, rather than the exception.
