13. November 2015
Source: The Guardian
Analysts warn that automation is now affecting mental labour as well as physical. So what tasks are vulnerable?
Fear of mass unemployment has been proved wrong as automation makes the economy stronger
The fear that robots will destroy jobs and leave a great mass of people languishing in unemployment is almost as old as automation itself. And yet, from the Luddites onwards, the fears have eventually been proved wrong, and the economy has ended up stronger than before.
But more and more analysts worry that this may be about to change. And on Thursday the Bank of England’s chief economist warned that this wave of automation is threatening skilled roles. The jobs of the middle classes, with their expensive university educations, are now at risk. As a result, a huge number of jobs that were previously thought safe from machine-led disruption are firmly in the firing line. Read the rest of this entry »
11. October 2015
Source: Fortune, January 2015, Ram Charan
Get ready for the most sweeping business change since the Industrial Revolution.
The single greatest instrument of change in today’s business world, and the one that is creating major uncertainties for an ever-growing universe of companies, is the advancement of mathematical algorithms and their related sophisticated software. Never before has so much artificial mental power been available to so many—power to deconstruct and predict patterns and changes in everything from consumer behavior to the maintenance requirements and operating lifetimes of industrial machinery. In combination with other technological factors—including broadband mobility, sensors, and vastly increased data-crunching capacity—algorithms are dramatically changing both the structure of the global economy and the nature of business.
Though still in its infancy, the use of algorithms has already become an engine of creative destruction in the business world, fracturing time-tested business models and implementing dazzling new ones. The effects are most visible so far in retailing, creating new and highly interactive relationships between businesses and their customers, and making it possible for giant corporations to deal with customers as individuals. At Macy’s, for instance, algorithmic technology is helping fuse the online and the in-store experience, enabling a shopper to compare clothes online, try something on at the store, order it online, and return it in person. Algorithms help determine whether to pull inventory from a fulfillment center or a nearby store, while location-based technologies let companies target offers to specific consumers while they are shopping in stores. Read the rest of this entry »
21. September 2015
Source: The Wall Street Journal
Computers govern how long the microwave heats food or the dryer spins clothes.
Can they learn to form ideas and theories about the world around them as well?
In a particularly memorable episode of CBS’s “The Big Bang Theory,” physicist Sheldon Cooper and neurobiologist Amy Farrah Fowler get into an argument, a game of intellectual one-upmanship that threatens their relationship. Sheldon claims that “a grand unified theory, insofar as it explains everything, will ipso facto explain neurobiology.” Amy counters: “Yes, but if I’m successful, I will be able to map and reproduce your thought process in deriving a grand unified theory and therefore subsume your conclusions under my paradigm.”
The first contention is a familiar one—the second more surprising. But could it be true? Pedro Domingos, a computer scientist at the University of Washington, believes that a version of Amy’s notion is indeed true. All knowledge could be reproduced—and new knowledge produced—by “subsuming” human thought processes. And he thinks computer scientists are well on their way to doing it. Read the rest of this entry »
31. August 2015
Source: Fast Company
GIVEN THE VAST AMOUNTS OF DATA GOOGLE HAS ON US THROUGH OUR SEARCHES, IT’S A WONDER THEY HAVEN’T DONE THIS SOONER.
It’s been the subject of a feature film, a main theme of a best-selling book, a source of endless speculation and analysis (yielding 21 million results on the search “how google hires”), and a holy grail-like quest for some two million hopefuls per year.
It’s the hiring process at Google.
The search giant has been known to deploy quirky recruitment tactics, from banners and billboards emblazoned with a mathematical riddle aimed at enticing engineers to brainteasers about golf balls and school buses. The latter tactics, admitted Google’s head of people operations, Laszlo Bock, were “a complete waste of time,” while the former didn’t net the company any new hires. Read the rest of this entry »
24. January 2015
New technology tools are making adoption by the front line much easier, and that’s accelerating the organizational adaptation needed to produce results.
The world has become excited about big data and advanced analytics not just because the data are big but also because the potential for impact is big. Our colleagues at the McKinsey Global Institute (MGI) caught many people’s attention several years ago when they estimated that retailers exploiting data analytics at scale across their organizations could increase their operating margins by more than 60 percent and that the US healthcare sector could reduce costs by 8 percent through data-analytics efficiency and quality improvements.
Unfortunately, achieving the level of impact MGI foresaw has proved difficult. True, there are successful examples of companies such as Amazon and Google, where data analytics is a foundation of the enterprise. But for most legacy companies, data-analytics success has been limited to a few tests or to narrow slices of the business. Very few have achieved what we would call “big impact through big data,” or impact at scale. For example, we recently assembled a group of analytics leaders from major companies that are quite committed to realizing the potential of big data and advanced analytics. When we asked them what degree of revenue or cost improvement they had achieved through the use of these techniques, three-quarters said it was less than 1 percent. Read the rest of this entry »
11. October 2014
Source: Project Syndicate
Nathan Eagle is the CEO of Jana, a World Economic Forum Technology Pioneer.
BOSTON – Nearly everyone has a digital footprint – the trail of so-called “passive data” that is produced when you engage in any online interaction, such as with branded content on social media, or perform any digital transaction, like purchasing something with a credit card. A few seconds ago, you may have generated passive data by clicking on a link to read this article.
Passive data, as the name suggests, are not generated consciously; they are by-products of our everyday technological existence. As a result, this information – and its intrinsic monetary value – often goes unnoticed by Internet users.
But the potential of passive data is not lost on companies. They recognize that such information, like a raw material, can be mined and used in many different ways. For example, by analyzing users’ browser history, firms can predict what kinds of advertisements they might respond to or what kinds of products they are likely to purchase. Even health-care organizations are getting in on the action, using a community’s purchasing patterns to predict, say, an influenza outbreak.
Indeed, an entire industry of businesses – which operate rather euphemistically as “data-management platforms” – now captures individual users’ passive data and extracts hundreds of billions of dollars from it. According to the Data-Driven Marketing Institute, the data-mining industry generated $156 billion in revenue in 2012 – roughly $60 for each of the world’s 2.5 billion Internet users. Read the rest of this entry »
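The article’s per-user figure is a simple division, which can be checked directly (the inputs are the two numbers the article itself cites; the exact quotient comes out slightly above the rounded “roughly $60”):

```python
# Back-of-the-envelope check of the article's per-user figure:
# industry revenue spread across the world's Internet users.
industry_revenue = 156e9   # USD, 2012 (Data-Driven Marketing Institute)
internet_users = 2.5e9     # global Internet users cited in the article

revenue_per_user = industry_revenue / internet_users
print(f"${revenue_per_user:.2f} per user")
```

The exact result, $62.40, rounds down to the article’s “roughly $60.”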
9. October 2014
Source: Technology Review
If you’ve ever struggled to make sense of an information firehose, perhaps a 3-D printed model could help.
One of the characteristics of our increasingly information-driven lives is the huge amount of data being generated about everything from sporting activities and Twitter comments to genetic patterns and disease predictions. These information firehoses are generally known as “big data,” and with them comes the grand challenge of making sense of the material they produce.
That’s no small task. The Twitter stream alone produces some 500 million tweets a day. This has to be filtered, analyzed for interesting trends, and then displayed in a way that humans can make sense of quickly.
It is this last task of data display that Zachary Weber and Vijay Gadepally have taken on at MIT’s Lincoln Laboratory in Lexington, Massachusetts. They say that combining big data with 3-D printing can dramatically improve the way people consume and understand data on a massive scale.
They make their argument with a 3-D printed model of the MIT campus. Using a laser ranging device to measure the buildings, they built a digital model of the campus and printed it in translucent plastic using standard 3-D printing techniques.
One advantage of the translucent plastic is that it can be illuminated from beneath with different colors. Indeed, the team used a projector connected to a laptop computer to beam an image on the model from below. The image above shows the campus colored according to the height of the buildings.
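The color-by-height step described above can be sketched in a few lines. This is a hypothetical illustration, not Lincoln Laboratory’s actual code: the building names, heights, and the blue-to-red color ramp are all assumptions made for the example.

```python
# Illustrative sketch: map each building's height to an RGB color
# for the image projected onto the translucent model from below.
def height_to_rgb(height, min_h, max_h):
    """Linearly interpolate from blue (lowest) to red (tallest)."""
    t = (height - min_h) / (max_h - min_h) if max_h > min_h else 0.0
    return (int(255 * t), 0, int(255 * (1 - t)))  # (R, G, B)

# Hypothetical building heights in meters
buildings = {"Building 7": 30.0, "Green Building": 90.0, "Stata Center": 50.0}
lo, hi = min(buildings.values()), max(buildings.values())
colors = {name: height_to_rgb(h, lo, hi) for name, h in buildings.items()}
```

In practice the projector image would assign such a color to every building footprint in the model.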
But that’s only the beginning of what they say is possible. To demonstrate, Weber and Gadepally filtered a portion of the Twitter stream to pick out tweets that were geolocated at the MIT campus. They can then use their model to show what kind of content is being generated in different locations on the campus and allow users to slice and dice the data using an interactive screen. “Other demonstrations may include animating twitter traffic volume as a function of time and space to provide insight into campus patterns or life,” they say.
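The geo-filtering step they describe amounts to keeping tweets whose coordinates fall inside a bounding box around the campus, then counting content per location. A minimal sketch, assuming illustrative coordinates and tweet records (the bounding box is only an approximation of the MIT campus, and the tweet dicts are invented for the example):

```python
# Sketch of filtering a tweet stream to the MIT campus and
# tallying tweets per on-campus location.
from collections import Counter

# Approximate bounding box around the MIT campus (illustrative values)
MIT_BOX = (42.354, 42.362, -71.105, -71.085)  # (lat_min, lat_max, lon_min, lon_max)

def on_campus(tweet):
    """True if the tweet's (lat, lon) falls inside the campus box."""
    lat, lon = tweet["coords"]
    lat_min, lat_max, lon_min, lon_max = MIT_BOX
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

# Invented sample records standing in for the geolocated stream
tweets = [
    {"coords": (42.3601, -71.0942), "place": "Stata Center"},
    {"coords": (42.3736, -71.1097), "place": "Harvard Yard"},   # off campus
    {"coords": (42.3592, -71.0921), "place": "Student Center"},
]

by_location = Counter(t["place"] for t in tweets if on_campus(t))
```

Binning the surviving tweets by location is what lets the model light up each building according to the content generated there.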
Read the rest of this entry »