Human predictions about AI winning games are wrong

When Garry Kasparov challenged IBM’s chess-playing computer Deep Blue, he was absolutely certain he would win.  An article in USA Today on 2 May 1997 quoted him as saying “I’m going to beat it absolutely.  We will beat machines for some time to come.”

He was beaten conclusively.

In early 2016 another landmark in game-playing computing was reached when DeepMind’s AlphaGo challenged Lee Se-dol to a match of Go.  The Asian game is an order of magnitude more complex than chess, and before the match Lee observed that “AlphaGo’s level doesn’t match mine.”

Other expert players backed Lee Se-dol, predicting he would win 5-0.  In the end he won only a single game.

Now the same team that developed AlphaGo is setting its sights on the computer game StarCraft II. This is a whole new domain for artificial intelligence because, as The Guardian points out:

StarCraft II is a game full of hidden information. Each player begins on opposite sides of a map, where they are tasked with building a base, training soldiers, and taking out their opponent. But they can only see the area directly around units, since the rest of the map is hidden in a “fog of war”.

“Players must send units to scout unseen areas in order to gain information about their opponent, and then remember that information over a long period of time,” DeepMind says in a blogpost. “This makes for an even more complex challenge as the environment becomes partially observable – an interesting contrast to perfect information games such as Chess or Go. And this is a real-time strategy game – both players are playing simultaneously, so every decision needs to be computed quickly and efficiently.”
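The partial-observability challenge DeepMind describes can be illustrated with a toy fog-of-war computation. This is a hypothetical sketch – the function name and the simple grid model are my own, not DeepMind’s code:

```python
# Toy model of a "fog of war": a player sees only the grid cells within
# sight range of its own units; everything else stays hidden and must be
# remembered from earlier scouting. Illustrative sketch only.

def visible_cells(units, sight, width, height):
    """Return the set of grid cells revealed by a player's units.

    units: list of (x, y) unit positions; sight: Chebyshev sight radius.
    """
    seen = set()
    for ux, uy in units:
        for x in range(max(0, ux - sight), min(width, ux + sight + 1)):
            for y in range(max(0, uy - sight), min(height, uy + sight + 1)):
                seen.add((x, y))
    return seen

# One scout at (1, 1) on an 8x8 map reveals a 3x3 patch; the other 55
# cells remain in the fog -- the agent must scout and remember them.
seen = visible_cells(units=[(1, 1)], sight=1, width=8, height=8)
hidden = [(x, y) for x in range(8) for y in range(8) if (x, y) not in seen]
```

The point of the sketch is the asymmetry with chess or Go: here the game state visible to the agent is only ever a small, shifting subset of the true state.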

Once again, humans believe the computer cannot beat them.  In the Guardian article, StarCraft’s executive producer is quoted as saying “I stand by our pros. They’re amazing to watch.”

Sound familiar?

If an AI can win at a game like StarCraft, it is exciting and troubling at the same time.

It will mean that an AI has had to draw on ‘memory,’ take measured risks and develop strategy in a manner that beats a human. These three things – pattern recognition (from memory), risk taking, and strategy – are skills that command a premium wage in economies that value ‘knowledge workers.’

In 2015 a research team at Oxford University published a study predicting that 35% of current jobs are at “high risk of computerisation over the following 20 years.”  The StarCraft challenge might cause them to revise that prediction upwards.

Economist on the relevance of the blockchain

If you are not familiar with the blockchain, The Economist has an excellent primer on it that goes beyond Bitcoin, the technology’s first mover.

The graphic in the article is a good explanation of how the chain is built, and how it’s kept unique.

Towards the end of the article is a section that nails why it’s important beyond currency:

One of the areas where such ideas could have radical effects is in the “internet of things”—a network of billions of previously mute everyday objects such as fridges, doorstops and lawn sprinklers. A recent report from IBM entitled “Device Democracy” argues that it would be impossible to keep track of and manage these billions of devices centrally, and unwise to try; such attempts would make them vulnerable to hacking attacks and government surveillance. Distributed registers seem a good alternative.

The sort of programmability Ethereum offers does not just allow people’s property to be tracked and registered. It allows it to be used in new sorts of ways. Thus a car-key embedded in the Ethereum blockchain could be sold or rented out in all manner of rule-based ways, enabling new peer-to-peer schemes for renting or sharing cars. Further out, some talk of using the technology to make by-then-self-driving cars self-owning, to boot. Such vehicles could stash away some of the digital money they make from renting out their keys to pay for fuel, repairs and parking spaces, all according to preprogrammed rules.


Source: The great chain of being sure about things
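The chaining idea at the heart of the article can be sketched in a few lines: each block commits to the hash of its predecessor, so rewriting any past block invalidates every later link. This is an illustrative toy model only, not Bitcoin’s actual data format:

```python
# Minimal sketch of how a blockchain is built and kept tamper-evident.
import hashlib
import json

def block_hash(block):
    # Deterministic SHA-256 hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    # Each new block commits to the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def is_valid(chain):
    # Valid when every block points at the actual hash of its predecessor.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for entry in ["genesis", "alice pays bob", "bob pays carol"]:
    add_block(chain, entry)

assert is_valid(chain)
chain[1]["data"] = "alice pays mallory"  # rewrite history...
assert not is_valid(chain)               # ...and the chain no longer verifies
```

It is exactly this tamper-evidence – history that cannot be quietly rewritten – that makes the distributed registers mentioned above plausible for tracking billions of devices.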

The implications of quantum computing

At the last Foresight Week event in Singapore two years ago, Peter Schwartz and I had a long discussion about the implications of quantum computing. We concluded that a ‘computing arms race’ was developing between governments and consumers.

At the highest level of abstraction, the foundations of computing have remained unchanged since the development of the transistor.  The rise of the PC made it inevitable that consumers would possess extremely fast computers, which among other things enabled strong security and privacy through encryption.  No matter how fast government computers became, there would be enough horsepower available to consumers to secure their privacy.

Now this is changing.  The development of the quantum computer means that the next evolution of computing will put the average person into a state of inherent insecurity, because quantum computers will be able to break the encryption currently in use.  An article in the Washington Post highlights this:

Quantum mechanics is now being used to construct a new generation of computers that can solve the most complex scientific problems—and unlock every digital vault in the world. These will perform in seconds computations that would have taken conventional computers millions of years.

This also means that governments and corporations will once more be the leaders in computing, harking back to the days of mainframe computing – when state-of-the-art computational power was unaffordable for the average person.  However, unlike the democratisation of computing power that has taken place since the development of the desktop, it’s likely to be a much shorter time span before quantum computing is available in the home – or in your pocket.

In the meantime, however, the deployment of this new type of computing by security agencies is likely to add to global volatility.

Article: The third industrial revolution

A quick link to an article in The Economist on a topic that we’ve explored many times for different clients, starting back in 2007 for the Shell Technology Futures programme.

The factory of the future will focus on mass customisation—and may look more like those weavers’ cottages than Ford’s assembly line.

via Manufacturing: The third industrial revolution | The Economist.

Radio Interview on Cashless Societies (ABC Australia Future Tense)

Over the weekend ABC Australia aired a programme about the rise (or otherwise) of the cashless society.  It included an interview with me about my experience of these technologies, and specifically about my time at Egg (the UK branchless bank) in the early 2000s.  Here’s the blurb (and a link at the bottom):

We hear a lot about the cashless society and the death of the local bank branch—as commerce becomes increasingly digital. But how close are we to a completely cashless environment? Is it still possible to live a whole year without those little pieces of paper or polymer we carry in our pockets? We look at the rate of change when it comes to money’s digital future and whether all of us are heading for a cashless future at the same speed.

via Money, banks and our changing times – Future Tense – ABC Radio National Australian Broadcasting Corporation.

The Economist on 3D printing and making

A short snippet from an article in The Economist that links the new hobby of ‘tinkering’ with possible disruption.  This is something we saw coming a few years back and identified as part of the Shell Technology Futures programme in 2007.  It’s fascinating to see it unfolding:

“The tools of factory production, from electronics assembly to 3D printing, are now available to individuals, in batches as small as a single unit,” writes Chris Anderson, the editor of Wired magazine.

It is easy to laugh at the idea that hobbyists with 3D printers will change the world. But the original industrial revolution grew out of piecework done at home, and look what became of the clunky computers of the 1970s. The maker movement is worth watching.

via Monitor: More than just digital quilting | The Economist.

3D printing just got more interesting

Quick update that has implications for distributed manufacturing, supply chains and last, but not least, intellectual property:

The Pirate Bay announced a new, legitimate direction yesterday: it’s going to host physibles, downloadable models for constructing 3D objects.

The Pirate Bay’s move into physibles breaks new ground, since 3D printing is territory copyright lawyers have barely begun to fathom.

A “physible” is a digital plan for an object that can either be designed on a computer or uploaded with a 3D scanner. Those plans can be downloaded and used to assemble real, tangible objects using a 3D printer. Printers are getting more affordable, but they’re still limited by the kinds of materials they can use. But that just means it’s the dawn of this technology, and The Pirate Bay is getting in early. “We believe that in the nearby future you will print your spare [parts] for your vehicles,” TPB writes on its blog. “You will download your sneakers within 20 years.”

via Forget MP3s: Soon You’ll Download Your Sneakers From The Pirate Bay.

Phones create multiplayer games in the real world

File this under “the future is already here, it’s just unevenly distributed.”

Using the viewfinders of their smartphones, gamers can view paranormal activity layered over their surrounding environment and join a massive multi-player game that requires completing location-based missions and casting spells on real-world locations. Missions are generated in any real world location, asking players to complete challenges in order to advance the story line, gain new spells, and earn status points. The game can be played anywhere in the world, enabling multiple players to compete and collaborate in the global battle between good and evil.

Read more about this fascinating combination of technologies in an interview with the developers at PSFK here: Game Creates Worldwide Zombie Hunt Using Augmented Reality.

Immersion, reality, zombies and fitness

The wonderful London gaming studio Six to Start is working on a project that has been funded by Kickstarter. It’s a game called Zombies, Run!, and is an augmented audio running game for the iPhone, iPod Touch and Android that challenges users to rebuild civilization after a zombie apocalypse by completing location-specific tasks while running in the real world.

Users cue the app and don headphones to collect medicine, ammo, batteries, and spare parts which can be used to build up and expand their base — all while getting orders, clues, and a story through their headphones. Missions last around 20-30 minutes and can be played in any city. The platform additionally records the distance, time, pace, and calories burned during all runs.

This is a wonderful mix of many interesting trends: crowdsourced funding, augmented reality, and mobile computing combining to create a game with real world goals.

via Augmented Audio Game Spurs Fitness By Immersing Runners In Zombie Infested World @PSFK.

Societal, technological and organisational change

Every so often I read something which stops me in my tracks.  “A Long-Wave Theory on Today’s Digital Revolution”  on the Booz & Co Strategy and Business site falls squarely into this category.

It’s an interview with historian Elin Whitney-Smith and has a range of insights that are worth sharing.   Whitney-Smith has spent 30 years researching and refining her theory of economic progress as a series of information technology disruptions, drawing on studies of subjects as varied as digital media design, medieval gender relationships, and the extinctions at the end of the Pleistocene epoch.

Her theory is that:

There have been six information revolutions in human history. Each represents a major change in the organizational paradigm — a change in how people form themselves into groups.

  • The first was among hunter–gatherers just before the invention of agriculture;
  • second, the rise of counting and written language;
  • third, the fall of Rome;
  • fourth, the invention of the printing press;
  • fifth, the electric information revolution that accompanied trains, telegraph, and telephone; and
  • sixth, the digital information revolution that we are now living through.

In the last three, the economics follow the same pattern: a long boom followed by a crash. Then a difficult and turbulent struggle begins. New ways of organizing emerge and the old ways, supported by established elites, fail.

This has close parallels with the theory of technological innovation proposed by Ray Kurzweil, which led him to his theory of ‘the singularity,’ where humans and machines merge.  Kurzweil’s theory is that each technology wave – from the discovery of fire onwards – has happened successively faster.  Whitney-Smith makes a similar observation:

Throughout history, the time frame has gotten shorter. Among hunter–gatherers, it took thousands of years to make the transition to agriculture. From the fall of Rome to the press was almost 1,000 years. The printing press revolution took 220 years. The electric revolution [trains, telegraph, and telephone] took 110 years, and, as I count it, the digital revolution started about 50 years ago. So, in recent information revolutions, there is a kind of rule of halves.
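The “rule of halves” is simple arithmetic, sketched here with the durations Whitney-Smith cites (printing press ~220 years, electric ~110). The function is a toy model of my own, an illustration rather than a forecast:

```python
# Whitney-Smith's "rule of halves": each recent information revolution
# runs roughly half as long as the one before it. Illustrative only.
def rule_of_halves(first_duration, n):
    """Durations of n successive revolutions, each half the one before."""
    return [first_duration / 2 ** i for i in range(n)]

durations = rule_of_halves(220, 3)
print(durations)  # [220.0, 110.0, 55.0]
```

On this reading, a digital revolution starting around 50 years before the interview would be expected to play out over roughly 55 years.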

According to Whitney-Smith this has wide ranging implications, including changes for organizations:

We’re just starting to see the organizational innovation of the second phase emerge. These new companies take the Internet for granted. They are designed by a generation that had access to computers from childhood. Businesses that are less bound by old forms of hierarchical authority, such as Facebook (where any engineer can modify any part of Facebook’s code base), are thriving. So are companies with massive line worker input such as the “open management” …

…companies that use these new ways of organizing will out-compete the old. If the rule of halves still applies, we would expect this new information order to manifest itself by sometime around 2012.

This is supported by evidence that companies are already embracing a ‘co-creation’ framework rather than a top-down approach.  For example, I’m working with a number of forward-thinking clients on the deployment of Spigit – an online idea-management tool that empowers everyone in an organization (especially front-line workers).

Whitney-Smith’s theory also has implications on a global scale:

In the short run, it’s better to be a member of the elite in China than a college student elsewhere with free information access. But bottom-up innovation will always be more successful in the long run. Therefore, if China continues its closed information policy, its success won’t last because regular people won’t be able to innovate.

Last but not least, the theory weighs in on the importance of moving away from the core to look for changes at the periphery and the edges:

“Lasting innovation in an information revolution doesn’t come from the elite, or from people who already have access to wealth and authority. It comes from the edges…”