Tag Archives: Cambridge

For Sale – 27” mid-2011 iMac i7 / 16GB, for spares or repair

I was using it, then it stopped. I spoke to Apple Support and they booked me in with a Genius at the Cambridge store. They ran a diagnostic and it passed all their tests, but the technician suspected a hard drive failure. The machine is classed as vintage, so Apple would not repair it. I have already replaced it with a new one, so I want this one gone. The technician at Apple removed the hard drive for me, so that is not included. As for condition, I can see no marks, but he warned me that there may now be dust between the glass and the screen. I can take pictures if needed. It is boxed and comes with the mouse only.


Blockchain solutions — and disruption — pondered at EmTech 2018

CAMBRIDGE — The World Bank, one of the most powerful financial institutions on the planet, is experimenting with blockchain as a tool to track agricultural goods and raise capital.

Gideon Lichfield, the editor in chief of the MIT Technology Review, found some irony in that.

“This technology that was invented by somebody whose true identity we still don’t know — Satoshi Nakamoto — specifically to take power away from financial institutions and put currency in the hands of the people is now being used by the ultimate, central, financial institution,” Lichfield told an audience at EmTech 2018, a conference focused on big data, artificial intelligence and technology.

The crowd gathered at MIT’s Media Lab had just heard from two thinkers in the increasingly mainstream field of blockchain, a method of distributed ledgers that can dramatically alter how transactions are made and verified.
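
For readers new to the mechanics, the tamper-evidence of such a ledger comes from chaining records together with cryptographic hashes. Here is a minimal sketch in Python of that core idea (illustrative only; real blockchains add consensus protocols, digital signatures and peer-to-peer replication):

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    # Hash a canonical JSON encoding of the block's contents.
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def new_block(transactions, prev_hash: str) -> dict:
    # Each block records its predecessor's hash, so altering any
    # earlier transaction changes every hash that follows it.
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }

# Build a tiny three-block chain.
chain = [new_block(["genesis"], prev_hash="0" * 64)]
for txs in (["alice pays bob 5"], ["bob pays carol 2"]):
    chain.append(new_block(txs, prev_hash=hash_block(chain[-1])))

# Verification: recompute each block's hash and compare it with the
# prev_hash stored in its successor.
for prev, curr in zip(chain, chain[1:]):
    assert curr["prev_hash"] == hash_block(prev), "ledger was tampered with"
```

Because each block stores the hash of its predecessor, altering any past transaction invalidates every hash that follows it, and replicating the chain across many machines removes the need to trust any single record keeper.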

Ledgers themselves date back to cuneiform records etched into tablets 7,000 years ago at the dawn of civilization, said Michael Casey, an author and senior advisor to the Digital Currency Initiative at the MIT Media Lab. If blockchain solutions decentralize financial ledgers in the future, that change could disrupt the flow of money into the world’s financial hubs. Using the 21st century version of the ledger, governments and other institutions could invest the money they save on financing in other causes.


“If they could raise money more cheaply, you’d have a lot more funds to put into education, to put into health,” Casey said. “Why should [the cost of financing] go into the hands of a large investment bank when it could be going back to the poor?”

Blockchain solutions could also help the so-called underbanked and unbanked gain access to financial services. Distributed ledgers accrue credibility by replicating transaction records across a network of computers. Casey said that credibility could benefit people in places like Nairobi, Kenya, who have difficulty leveraging value from their real estate because banks distrust their property records.

“The lack of trust in the record-keeping function has a huge impact on the world,” he said.

The World Bank’s Prema Shrikrishna and MIT Media Lab’s Michael Casey discuss blockchain’s potential to provide a new model of trust at EmTech 2018.

World Bank experiments with blockchain solutions

The altruistic applications of blockchain were a focus of Casey’s EmTech talk with Prema Shrikrishna, who works on blockchain projects at World Bank Group.

Teaming up with the International Finance Corporation, the World Bank is currently designing a blockchain architecture to track oil palm from the farm to mills, where it becomes palm oil — an agricultural staple in everything from chocolate to candles. By tracking the origin of the raw material, most of which is produced in Indonesia, blockchain could reward farmers for sustainable practices, according to Shrikrishna.

Among other World Bank experiments with blockchain: 

Education. The World Bank is developing a system for rewarding students playing an educational game called Evoke, which is designed to teach skills for success in modern society, Shrikrishna said.

Vaccine management. In December, Oleg Kucheryavenko, a public health professional who works with the World Bank, wrote on the institution’s blog that blockchain could provide a “cost-effective solution” for vaccine distribution. Vaccines have a shelf-life, Kucheryavenko wrote, and the supply chain is “too complex to be taken for granted, with vaccines changing ownership from manufacturers to distributors, re-packagers and wholesalers before reaching its destination.”

Financing. In August, the World Bank sold blockchain-enabled bonds through the Commonwealth Bank of Australia, which raised about $80.5 million, according to Reuters.

Blockchain’s best use cases

Members of the audience at the talk had varying aspirations for blockchain’s use.

Rahul Panicker, chief innovation officer at Wadhwani Institute for Artificial Intelligence, which focuses on technological solutions to large-scale societal problems, believes blockchain can be harnessed for humanitarian causes.

“It was very encouraging to see an organization like the World Bank being willing to look at these frontier technologies, and especially a technology like blockchain that has the ability to reduce friction in the financial system,” said Panicker, after attending the talk. “The whole purpose of blockchain is actually to minimize the burden of trust. The cost of trust is especially high in the developing world, so the fact that organizations like the World Bank are willing to look at this can mean big things for the disempowered.”

Tom Hennessey, an attendee, posited that financial settlement was the most readily available application.

Tomas Jansen, of Belgium’s Federal Agency for the Reception of Asylum Seekers, said a lot of refugees arrive in Europe without identification papers because they belong to a marginalized group or lost their documents. Jansen wanted to hear ideas from the blockchain experts on how to address those problems.

Shrikrishna sidestepped the political ramifications, but she noted that the World Bank has a program called Identification for Development that is working on integrating ID databases and creating an identity that would be “portable across borders.”

She said the World Bank is “technology agnostic” in seeking to solve problems around the globe, and stressed that the financial institution’s approach with blockchain has been both “very cautious” and “very experimental.”

Blockchain disruption

The World Bank is hardly alone in its exploration of blockchain solutions to solve problems and change how business is done. Analysts expect blockchain to have a major impact on businesses, which are eyeing its potential to manage supply chains, verify documents and trade securities. The research firm Gartner estimates blockchain will add $3.1 trillion to the world economy by 2030. Some industry sectors have been quicker than others to start experimenting.

Describing blockchain as at an “inflection point,” a recent report by the consultancy Deloitte found that financial services executives are “leading the way in using blockchain to reexamine processes and functions that have remained static for decades,” and emerging players are using blockchain to challenge traditional business models.

Meanwhile, blockchain’s most developed use case — bitcoin — is driving most of the interest in the technology, while taking those invested in the cryptocurrency on a roller coaster ride.

So far development of a “stable coin” has been a “difficult nut to crack,” according to Casey, who used to cover currencies for The Wall Street Journal.

To stabilize the tender, a coin could be pegged to other metrics, or it could be backed by a reserve of funds, Casey said. One way or another, he predicted, developers will find success.

“Something’s going to work. Something’s going to break as well,” Casey said.

Machine intelligence could benefit from child’s play

CAMBRIDGE — Current progress in machine intelligence is newsworthy, but it’s often talked about out of context. MIT’s Josh Tenenbaum described it this way: Advances in deep learning are powering machines to accurately recognize patterns, but human intelligence is not just about pattern recognition; it’s about modeling the world.

By that, Tenenbaum, a professor of cognitive science and computation, was referring to abilities that humans possess such as understanding what they see, imagining what they haven’t seen, problem-solving, planning, and building new models as they learn.

That’s why, in the interest of advancing AI even further, Tenenbaum is turning to the best source of information on how humans build models of the world: children.

“Imagine if we could build a machine that grows into intelligence the way a person does — that starts like a baby and learns like a child,” he said during his presentation at EmTech 2018, an emerging technology conference hosted by MIT Technology Review.

Tenenbaum called the project a “moonshot,” one of several that researchers at MIT are exploring as part of the university’s new MIT Quest for Intelligence initiative to advance the understanding of human and machine intelligence. The “learning moonshot” is a collaborative effort by MIT colleagues, including AI experts, as well as those in early childhood development and neuroscience. The hope is to use how children learn as a blueprint to build a machine intelligence that’s truly capable of learning, Tenenbaum said.

The “quest,” as it’s aptly labeled, won’t be easy, partly because researchers don’t have a firm understanding of how learning happens, according to Tenenbaum. In the 1950s, Alan Turing, father of the Turing test for analyzing machine intelligence, presumed a child’s brain was simpler than an adult’s and that it was akin to a new notebook full of blank pages.

“We’ve now learned that Turing was brilliant, but he got this one wrong,” Tenenbaum said. “And many AI researchers have gotten this wrong.”

Child’s play is serious business

Instead, research such as that done by Tenenbaum’s colleagues suggests that newborns are already programmed to see and understand the world in terms of people, places and things — not just patterns and pixels. The research also suggests children aren’t passive learners but are actively experimenting, interacting with and exploring the world around them.


“Just like science is playing around in the lab, children’s play is serious business,” he said. “And children’s play may be what makes human beings the smartest learners in the known universe.”

Tenenbaum described his job as identifying insights like these and translating them into engineering terms. Take common sense, for example. Kids are capable of stacking cups or blocks without a single physics lesson. They can observe an action they’ve never seen before and yet understand the desired outcome and how to help achieve that outcome.

In an effort to codify common sense, Tenenbaum and his team are working with new kinds of AI programming languages that leverage the pattern recognition advances by neural networks, as well as concepts that don’t fit neatly into neural networks. One example of this is probabilistic inference, which enables machines to use prior events to predict future events.
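
As a toy illustration of probabilistic inference, the sketch below applies a single Bayesian update in Python; the hypotheses and all probabilities are invented for illustration:

```python
# Toy Bayesian update: infer which of two hypotheses better explains
# an observation, combining prior beliefs with likelihoods.
priors = {"block_tower_stable": 0.7, "block_tower_falls": 0.3}

# Likelihood of observing a wobble under each hypothesis.
likelihood_of_wobble = {"block_tower_stable": 0.1, "block_tower_falls": 0.8}

# Bayes' rule: posterior is proportional to prior x likelihood;
# dividing by the total evidence normalizes the result.
unnormalized = {h: priors[h] * likelihood_of_wobble[h] for h in priors}
evidence = sum(unnormalized.values())
posterior = {h: p / evidence for h, p in unnormalized.items()}

print(posterior)
# {'block_tower_stable': 0.2258..., 'block_tower_falls': 0.7741...}
```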

Game engines open window into learning

Tenenbaum’s team is also using game engines, which simulate a player’s experience in real time in a virtual world. Common game engine components include the graphics engine, which renders 2D and 3D images, and the physics engine, which transposes the laws of physics from the real world to the virtual one. “We think they provide first approximations to the kinds of basic commonsense knowledge representation that are built into even the youngest children’s brains,” he said.
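
At its core, a physics engine steps object state forward each frame according to the laws of motion. A minimal sketch in Python (one ball under gravity; real engines add collisions, friction and the kind of block stacking discussed earlier):

```python
# Minimal physics-engine step: advance a falling ball's state with a
# fixed timestep. Illustrative only.
dt, g = 1 / 60, -9.81          # timestep (s), gravity (m/s^2)
y, vy = 2.0, 0.0               # height (m), vertical velocity (m/s)

while y > 0:                   # simulate until the ball reaches the ground
    vy += g * dt               # semi-implicit Euler: update velocity first...
    y += vy * dt               # ...then position, using the new velocity

print(f"hit ground at speed {abs(vy):.2f} m/s")
```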

He said the game engines, coupled with probabilistic programming, capture data that helps researchers understand what a baby knows at 10 months or a year old, but the question remains: How does a baby learn how to build engines like these?

“Evolution might have given us something kind of like game engine programs, but then learning for a baby is learning to program the game engine to capture the program of their life,” he said. “That means learning algorithms have to be programming algorithms — a program that learns programs.”

Tenenbaum called this “the hard problem of learning.” To solve it, he’s focused on the easier problem of how people acquire simple visual concepts such as learning a character from a new alphabet without needing to see it a thousand times. Using Bayesian program learning, a machine learning method, researchers have been able to program machines to see an output, such as a character, and deduce how the output was created from one example.
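
The talk didn’t include Tenenbaum’s actual models, but the flavour of Bayesian program learning can be conveyed with a toy example: given a single observation, score a handful of candidate generator “programs” by prior times likelihood and keep the most plausible one. Everything below, the programs, the noise model and the numbers, is invented for illustration:

```python
# Toy one-shot inference: which candidate "program" best explains one
# observed string?
import math

observation = "XOXO"

# Candidate programs: each generates strings by repeating a motif.
programs = {
    "repeat('XO')":  {"prior": 0.4, "generate": lambda n: ("XO" * n)[:n]},
    "repeat('X')":   {"prior": 0.4, "generate": lambda n: "X" * n},
    "repeat('XOX')": {"prior": 0.2, "generate": lambda n: ("XOX" * n)[:n]},
}

def likelihood(program, obs: str) -> float:
    # Noise model: each character independently matches the program's
    # output with probability 0.9, otherwise 0.1.
    rendered = program["generate"](len(obs))
    return math.prod(0.9 if a == b else 0.1 for a, b in zip(rendered, obs))

# Score by prior x likelihood and keep the best explanation.
scores = {name: p["prior"] * likelihood(p, observation)
          for name, p in programs.items()}
print(max(scores, key=scores.get))  # repeat('XO'), inferred from one example
```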

It’s an admittedly small step in the larger landscape of machine intelligence, Tenenbaum said. “But we also know from history that even small steps toward this goal can be big.”

60 seconds with … Cambridge Research Lab Director Chris Bishop

I joined Microsoft’s Research Lab in Cambridge, UK, when the lab first opened in 1997, and was named Lab Director two-and-a-half years ago, so I’ve been involved in growing and shaping the lab for more than two decades. Today my role includes leadership of the MSR Cambridge lab, as well as coordination of the broader Microsoft presence in Cambridge. I am fortunate in being supported by a very talented leadership team and a highly capable and motivated team of support staff.

What were your previous jobs?

My background is in theoretical physics. After graduating from Oxford, I did a PhD in quantum field theory at the University of Edinburgh, exploring some of the fundamental mathematics of matter, energy, and space-time. After my PhD I wanted to do something that would have potential for practical impact, so I joined the UK’s national fusion research lab to work on the theory of magnetically confined plasmas as part of a long-term goal to create unlimited clean energy. It was during this time that there were some breakthroughs in the field of neural networks. I was very inspired by the concept of machine intelligence, and the idea that computers could learn for themselves. Initially I started applying neural networks to problems in fusion research, and we became the first lab to use neural networks for real-time feedback control of a high-temperature fusion plasma.

In fact, I found neural networks so fascinating that, after about eight years working on fusion research, I took a rather radical step and switched fields into machine learning. I became a Professor at Aston University in Birmingham, where I set up a very successful research lab. Then I took a sabbatical and came to Cambridge for six months to run a major, international programme called “Neural Networks and Machine Learning” at the Isaac Newton Institute. The programme started on July 1, 1997, on the very same day that Microsoft announced it was opening a research lab in Cambridge, its first outside the US. I was approached by Microsoft to join the new lab, and have never looked back.

What are your aims at Microsoft?

My ambition is for the lab to have an impact on the real world at scale by tackling very hard research problems, and by leveraging the advantages and opportunities we have as part of Microsoft. I often say that I want the MSR Cambridge lab to be a critical asset for the company.

I’m also very passionate about diversity and inclusion, and we have introduced multiple initiatives over the last year to support this. We are seeing a lot of success in bringing more women into technical roles in the lab, across both engineering and research, and that’s very exciting to see.

What’s the hardest part of your job?

A core part of my job is to exercise judgment in situations where there is no clear right answer. For instance, in allocating limited resources I need to look at the risk, the level of investment, the potential for impact, and the timescale. At any one time there will be some things we are investing in that are quite long term but where the impact could be revolutionary, along with other things that have perhaps been researched for several years which are beginning to get real traction, all the way to things that have had real-world impact already. The hardest part of my job is to weigh up all these factors and make some difficult decisions on where to place our bets.

What’s the best part of your job?

The thing I enjoy most is the wonderful combination of technology and people. Those are two aspects I find equally fascinating, yet they offer totally different kinds of challenges. We, as a lab, are constantly thinking about technology, trends and opportunities, but also about the people, teams, leadership, staff development and recruitment, particularly in what has become a very competitive talent environment. The way these things come together is fascinating. There is never a dull day here.

What is a leader?

I think of leadership as facilitating and enabling, rather than directing. One of the things I give a lot of attention to is leadership development. We have a leadership team for the lab and we meet once a week for a couple of hours. I think about the activities of that team, but also about how we function together. It’s the diversity of the opinions of the team members that creates a value that’s greater than the sum of its parts. Leadership is about harnessing the capabilities of every person in the lab and allowing everyone to bring their best game to the table. I therefore see my role primarily as drawing out the best in others and empowering them to be successful.

What are you most proud of?

Last year I was elected a Fellow of the Royal Society, and that was an incredibly proud moment. There’s a famous book I got to sign, and you can flip back and see the signatures of Isaac Newton, Charles Darwin, Albert Einstein, and pretty much every scientist you’ve ever heard of. At the start of the book is the signature of King Charles II who granted the royal charter, so this book contains over three-and-a-half centuries of scientific history. That was a very humbling but thrilling moment.

Another thing I’m very proud of was the opportunity to give the Royal Institution Christmas Lectures. The Royal Institution was set up more than 200 years ago – Michael Faraday was one of the early directors – and around 14 Nobel prizes have been associated with the Institution, so there is a tremendous history there too. These days it’s most famous for the Christmas Lectures, which were started by Faraday. Ever since the 1960s these lectures have been broadcast on national television at Christmas, and I watched them as a child with my mum and dad. They were very inspirational for me and were one of the factors that led me to choose a career in science. About 10 years ago, I had the opportunity to give the lectures, which would have been inconceivable to me as a child. It was an extraordinary moment to walk into that famous iconic theatre, where Faraday lectured many times and where so many important scientific discoveries were first announced.

One Microsoft anecdote that relates to the lectures is that getting selected was quite a competitive process. It eventually came down to a shortlist of five people, and I was very keen to be chosen, especially as it was the first time in the 200-year history of the lectures that they were going to be on the subject of computer science. I was thinking about what I could do to get selected, so I wrote to Bill Gates, explained how important these lectures were and asked him whether, if I was selected, he would agree to join me as a guest in one of the lectures. Fortunately, he said yes, and so I was able to include this in my proposal to the Royal Institution. When I was ultimately selected, I held Bill to this promise, and interviewed him via satellite on live television during one of the lectures.

Chris Bishop is elected a Fellow of the Royal Society

What inspires you?

I love the idea that through our intellectual drive and curiosity we can use technology to make the world a better place for millions of people. For example, the field of healthcare today largely takes a one-size-fits-all approach that reactively waits until patients become sick before responding, and which is increasingly associated with escalating costs that are becoming unsustainable. The power of digital technology offers the opportunity to create a data-driven approach to healthcare that is personalised, predictive and preventative, and which could significantly reduce costs while also improving health and wellbeing. I’ve made Healthcare AI one of the focal points of the Cambridge lab, and I find it inspiring that the combination of machine learning, together with Microsoft’s cloud, could help to bring about a much-needed transformation in healthcare.

What is your favourite Microsoft product?

A few years ago, the machine learning team here in Cambridge built a feature, in collaboration with the Exchange team, called Clutter. It separates the email you should pay attention to now from messages that can be left until, say, a Friday afternoon. I love it because it’s used by tens of millions of people, and it has some very beautiful research ideas at the heart of it – something called a hierarchical Bayesian machine learning model. This gives it a nice out-of-the-box experience, a sort of average that does OK for everybody, but as you engage with it, it personalises and learns your particular preferences of what constitutes urgent versus non-urgent email. The other reason I’m particularly fond of it is that when I became Lab Director, the volume of email in my inbox quadrupled. That occurred just as we were releasing the Clutter feature, so it arrived just in time to save me from being overwhelmed.
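
Bishop doesn’t spell out Clutter’s internals here, but the behaviour he describes, an out-of-the-box average that personalises as you engage, is characteristic of hierarchical Bayesian models, which shrink a per-user estimate toward a population-level prior. A minimal sketch of that idea in Python (a Beta-Bernoulli model with invented numbers, not Clutter’s actual implementation):

```python
# Shrinkage in a Beta-Bernoulli model: a new user's estimate of
# "probability this sender's mail is urgent" starts at the population
# average and moves toward their own behaviour as evidence accumulates.
# All numbers are invented; this is not Clutter's actual model.

# Population-level prior: mail like this is urgent ~30% of the time.
# The pseudo-count strength controls how fast personal data takes over.
prior_mean, prior_strength = 0.30, 20.0
alpha0 = prior_mean * prior_strength        # prior "urgent" pseudo-counts
beta0 = (1 - prior_mean) * prior_strength   # prior "not urgent" pseudo-counts

def urgency_estimate(urgent_clicks: int, ignored: int) -> float:
    # Posterior mean of Beta(alpha0 + urgent_clicks, beta0 + ignored).
    return (alpha0 + urgent_clicks) / (prior_strength + urgent_clicks + ignored)

print(urgency_estimate(0, 0))    # 0.30 -> brand-new user gets the average
print(urgency_estimate(9, 1))    # 0.50 -> personal signal pulls the estimate up
print(urgency_estimate(90, 10))  # 0.80 -> with lots of data, mostly personal
```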

What was the first bit of technology that you were excited about?

When I was a child I was very excited about the Apollo moon landings. I was at an age where I could watch them live on television and knew enough to understand what an incredible achievement they were. Just think of that Saturn launch vehicle that’s 36 storeys high, weighs 3,000 tonnes, is burning 15 tonnes of fuel a second, and yet it’s unstable. So, it must be balanced, rather like balancing a broom on your finger, by pivoting those massive engines backwards and forwards on hydraulic rams in response to signals from gyroscopes at the top of the rocket. It’s that combination of extreme brute force with exquisite precision, along with dozens of other extraordinary yet critical innovations, that made the whole adventure just breath-taking. And the filtering algorithms used by the guidance system are an elegant application of Bayesian inference, so it turns out that machine learning is, literally, rocket science.


Data analytics in government efforts lack structure

CAMBRIDGE, Mass. — The U.S. government is adept at collecting massive amounts of data. Efforts to deploy data analytics in government agencies, however, can be weak and disorganized.

At some agencies, officials say there’s a lack of a cohesive system for government analytics and management.

“I recently learned that we have no real concept of data archiving, and data backup and protection,” said Bobby Saxon, CTO at the Centers for Medicare & Medicaid Services (CMS).

“We have archived everything in every place,” Saxon said. “It’s really just wasted data right now.”

Data analytics struggles

Speaking on a panel about data analytics in government at the annual MIT Chief Data Officer and Information Quality (CDOIQ) Symposium at the university’s Tang Center, Saxon described the struggles his agency has with analytics.

CMS, finally moving out of crisis mode after dealing with widely publicized IT problems with its healthcare.gov website, has an “OK” structure for data analytics and management, Saxon said.

While Saxon said he and his colleagues are working to improve the situation, currently the organization tends to rely on outside vendors to deal with difficult and pressing analytics problems.

“In the world of predictive analytics, typically the average vendor or subject expert will ask what are your questions, and go off and try to solve questions for you, and then ask if you have any more questions,” Saxon said.

Left to right: Bobby Saxon, CTO, Centers for Medicare & Medicaid Services; John Eltinge, U.S. Census Bureau; and Mark Krzysko of the Department of Defense at the annual MIT CDOIQ Symposium in Cambridge, Mass.

Outside help costly

Ultimately, while government analytics problems tend to get fixed to some extent, the corrective IT solutions can take weeks and often are simply too expensive in the long term, Saxon explained.


In addition, employees aren’t learning new data analytics techniques, so they can’t immerse themselves in the problems at hand and discover the root causes of what might be going wrong.

Panel moderator Mark Krzysko of the Department of Defense’s Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, noted a similar problem in his agency.

Krzysko said he had honed a personal strategy in his early years with the agency: “Use the tools they’ve given you.”

When a data dilemma arose, often he might see employees making calls to the Air Force or the Army for answers, instead of relying on their own government analytics tools, he said.

The panel, “Data Analytics to Solve Government Problems,” was part of the 12th Annual MIT CDOIQ Symposium, held July 18 to 20.

The panel also included John Eltinge of the United States Census Bureau.

Focus, scope and spotting opportunity are key to role of CDO

CAMBRIDGE, Mass. — In the age of big data, the opportunities to change organizations by using data are many. For a newly minted chief data officer, the opportunities may actually be too vast, making focus the most essential element in the role of CDO.

“It’s about scope,” said Charles Thomas, chief data and analytics officer at General Motors. “You struggle if you do too many things.”

As chief data officer at auto giant GM, Thomas is focusing on opportunities to repackage and monetize data. He called it “whale hunting,” meaning he is looking for the biggest opportunities.

Thomas spoke as part of a panel on the role of CDO this week at the MIT Chief Data Officer and Information Quality Symposium.

At GM, he said, the emphasis is on taking the trove of vehicle data available from today’s highly digitized, instrumented and connected cars. Thomas said he sees monetary opportunities in which GM can “anonymize data and sell it.”

The role of CDO is important, if not critical, Thomas emphasized in an interview at the event.

The nurturing CDO

“Companies generate more data than they use, so someone has to approach it from an innovative perspective — not just for internal innovation, but also to be externally driving new income,” he said. “Someone has to [be] accountable for that. It has to be their only job.”

“A lot of industries are interested in how people move around cities. It’s an opportunity to sell [data] to B2B clients,” Thomas added.

Focus is also important in Christina Clark’s view of the role of CDO. But nurturing data capabilities across the organization is the initial prime area for attention, said Clark, who is CDO at industrial conglomerate General Electric’s GE Power subsidiary and was also on hand as an MIT symposium panelist.

Every company should get good at aggregating, analyzing and monetizing data, Clark said.

“You then look at where you want to focus,” she said. The role of CDO, she added, is likely to evolve according to the data maturity of any given organization.

Focusing on data areas in which an organization needs rounding out was also important to symposium panelist Jeff McMillan, chief analytics and data officer at Morgan Stanley’s wealth management unit, based in New York.

The chief data officer role evolution
As the role of CDO changes, it’s becoming more strategic.

It’s about the analytics

“Organizations say, ‘We need a CDO,’ and then bring them in, but they don’t provide the resources they need to be successful,” he said. “A lot of people define the CDO role before they define the problem.”

It’s unwise to suggest a CDO can fix all the data problems of an organization, McMillan said. The way to succeed with data is to drive an understanding of data’s value as deeply into the organization as possible.

“That is really hard, by the way,” he added. At Morgan Stanley, McMillan said, his focus in the role of chief data officer has been around enabling wider use of analytics in advising clients on portfolio moves.

All things data and CDO

Tom Davenport, Babson College

Since widely materializing in the aftermath of the 2008 financial crisis, the role of CDO has been seen largely as one still seeking consensus.

Compliance and regulation tasks have often blended in a broad job description that has come to include big data innovation initiatives. But individual executives’ refinements to chief data officer approaches may be the next step for the role of CDO, longtime industry observer and Babson College business professor Tom Davenport said in an interview.

“Having someone responsible for all things data is not a workable task. So, you really need to focus,” Davenport said. “If you want to focus on monetization, that’s fine. If you want to focus on internal enablement or analytics, that’s fine.”

The advice to the would-be CDO is not unlike that for most any other position. “What you do must be focused; you can’t be all things to all people,” Davenport said.

Airbnb, Univision highlight best practices in BI

CAMBRIDGE, Mass. — For multibillion-dollar industry giants like Airbnb and Univision Communications, best practices in BI help drive growth.

BI at Airbnb

Data streams in constantly at Airbnb, and top executives want insights daily, said Theresa Johnson, a data scientist and products manager for financial infrastructure at the San Francisco-based company, during a panel at the 2018 Real BI Conference.

To handle demands, Johnson helped develop an easier system for creating and linking metrics, and she fosters a workplace environment that has her data science team focusing on collaboration.

Airbnb largely uses its own set of tools for sorting data, Johnson said. But she said one of the best practices in BI she used was to create a scalable system, with linked metrics about the online market for private vacation rentals that could be viewed and edited on the go with “slice-and-dice” tactics.

The system had its faults: when it was introduced, new metrics that harbored errors could inadvertently shut down large chunks of the system. Recognizing that, Johnson said she established a developer environment in which new metrics can be tested safely before being linked into the system.

Theresa Johnson, Airbnb

Johnson also developed scalable and repeatable tools to easily link new metrics without having to do the bulk of the work manually, which she said saves time and energy.

“One thing I learned from fellow designers is that you want to preference familiarity,” Johnson said. “You don’t want people to come to your tool and have to struggle [with] it.”

For these types of tools, “the more work that it is to use, the less likely I am to use it,” she said.

Data transparency

Data at Airbnb is also transparent, according to Johnson.

A version of the final reports on the data is accessible to anyone in the company, and it’s used weekly by certain key stakeholders.

Airbnb has about 4,000 employees, and most of them look at company reports, Johnson said.

Daily active users are far fewer, “but everyone in the company cares; everyone in the company should be informed,” she said, highlighting one of the best practices in BI.

Using an open and transparent platform inspires care and accountability, she noted. So, “don’t be secretive.”

Making data accessible

Simone Knight, Univision Communications

In her panel talk, Simone Knight, vice president of marketing and media intelligence at Univision Communications, noted that the New York-based media giant, which targets Hispanic Americans, has a similar philosophy.

Knight also cited the need for an easy-to-use and open system.

Listing some of the best practices in BI, she suggested setting up a dashboard on an “always-on display,” preferably on a large touchscreen. With a setup like that, she said, people will be more willing to look at the data and will be able to more easily dig into it.

In meetings, she said, “stop PowerPoints.” Use a live dashboard instead, which will be more interactive and feature the freshest information.

Storytelling as a best practice in BI

Noting the importance of telling a story with the data, especially in the marketing world, Knight said it’s best to try to weave stories into new products or tools when they launch, to better hook a customer and establish a user base. Storytelling can be important, too, when trying to explain data insights to management.

Knight said she often talks her 7-year-old daughter through the data first before presenting something during a meeting.

“If I can find a way to simplify it with her, I can find a way to simplify it with anyone,” she said.

She also noted the growing relevance of automated BI systems, which can save employees valuable time and cut costs.


In addition to BI, Univision has begun using predictive analytics for determining the potential ratings of upcoming television shows. Using an algorithm, the company can predict ratings on premieres, taking into account the time of year, other shows currently being run at that time and how well past premieres have done.
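
The article doesn’t describe Univision’s actual algorithm, but a ratings predictor built on those inputs could be structured as a simple regression. A minimal sketch in Python with invented data (the features mirror the ones named above, but the model choice and numbers are assumptions):

```python
# Hypothetical premiere-ratings predictor; data and coefficients are
# invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [month of premiere, competing shows in the slot,
#            average rating of the last three premieres]
X = np.array([
    [9, 2, 1.8],
    [1, 4, 2.1],
    [5, 3, 1.2],
    [9, 1, 2.5],
    [2, 5, 1.0],
])
y = np.array([2.0, 1.7, 1.1, 2.8, 0.9])  # observed premiere ratings

model = LinearRegression().fit(X, y)

# Predict a fall premiere with two competing shows and a strong track record.
print(model.predict(np.array([[9, 2, 2.3]])))
```

Month is used naively as a number here; a production model would encode seasonality and many more signals.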

For its part, Airbnb has its own “fully automated” framework in place for sorting through and digging into data, Johnson said in her own take on best practices in BI.

“We’ve set up a framework to make sure that we’re ingesting the right data, that we’re combining the data in ways that are meaningful, and that we’re surfacing regularly and reliably,” Johnson said.

It took a “bit of infrastructure” to make it automated, she said, but it now saves time, cost and energy.

The Real BI Conference was held June 27 to 28 at the MIT Tang Center.

The future of BI is bright, as experts note evolving trends

CAMBRIDGE, Mass. — Many are curious about the future of BI, but here at a business intelligence conference, experts made clear that it’s impossible to predict that future.

The best most of us can do is chart the past, keep an eye on the present and follow trends to suggest what might come next. That’s a point futurist Amy Webb and researcher Howard Dresner made quite clear in their keynotes at Dresner’s Real Business Intelligence Conference, which took place here this week.

Identifying trends is more of a science than an art, according to Webb, founder and CEO of the Future Today Institute and a professor at the New York University Stern School of Business.

Futurist sees AI in BI

As a futurist, Webb advises businesses and governments on technology trends. She said the advice she offers is “grounded in data.”

Every year, Webb publishes her Emerging Tech Trends Report. With close to 250 trends in this year’s edition, it covers topics ranging from the current state and future of BI and data management tools to emerging applications for various AI tools.

Speaking about AI, Webb noted the potential importance tools like machine learning systems could have for businesses.

Machine learning, in automating parts of data analysis, theoretically can better position a business across all of its branches — from sales to marketing to customer service — based on how much data it has available, Webb said.

Emerging tools like deep learning — which functions in a similar way, but can ideally develop more important inferences with less data — have the potential to change the way businesses house their data and how their data retention policies are structured, Webb continued.

Amy Webb, founder and CEO of the Future Today Institute

Small group of AI giants


Now, Webb said there are a total of nine companies “driving the entire culture of AI,” including Microsoft, Google and Amazon. She added that she expects the AI sector to consolidate even more.

In addition to their large budgets and powerful AI tools, these AI giants also maintain large-scale data storage tools and possess robust cloud platforms.

With the future of BI and analytics moving toward more cloud storage and automated analysis, Webb said she would expect to see a business “quickly moving to pick which system it will connect with” and flexibility becoming more limited.

Cloud critical to BI

Meanwhile, in his opening keynote at the conference, Dresner highlighted the results of surveys on the state of BI by his firm, Dresner Advisory Services, where he is chief research officer.

Dresner touched on the growing force of cloud computing. Since 2016, when its future was still somewhat uncertain, the cloud has “very much become mainstream,” he said.

Howard Dresner, chief research officer, Dresner Advisory Services

AI technologies are developing quickly, too, he said, although perhaps not as fast as their widespread publicity would make it appear. “Smart features will increasingly creep into many solutions,” Dresner said.

Over the last few years, vendors have made big strides in providing software as a service, which has now become fairly common. While dashboards and reporting are still the main priorities of BI teams, emerging technology and tools, such as edge computing and text analytics, are gaining in significance, he said.

In general, the future of BI looks positive, he said, noting that BI budgets are increasing year by year.

Editor’s note: TechTarget, publisher of SearchBusinessAnalytics, is a media partner of the event.

MIT CIO: What is digital culture, why it’s needed and how to get it

CAMBRIDGE, Mass. — Can large organizations adopt the digital cultures of 21st century goliaths like Amazon and Google? That was the question posed in the kickoff session at the recent MIT Sloan CIO Symposium.

The assumption — argued by panel moderator and MIT Sloan researcher George Westerman — is that there is such a thing as a digital culture. Citing MIT Sloan research, Westerman said it includes values like autonomy, speed, creativity and openness; it prevails at digital native companies whose mission is nothing less than to change the world; and it’s something that “pre-digital” companies need too — urgently.

Digital technologies change fast, Westerman said; organizations much less so. But as digital technologies like social, mobile, AI and cloud continue to transform how customers behave, organizational change is imperative — corporate visions, values and practices steeped in 20th century management theories must also be adapted to exploit digital technologies, or companies will fail.

“For all the talk we’ve got about digital, the real conversation should be about transformation,” said Westerman, principal research scientist at the MIT Initiative on the Digital Economy. “This digital transformation is a leadership challenge.”

Marrying core values with digital mechanisms

Creating a digital culture is not just about using digital technology or copying Silicon Valley companies, Westerman stressed. He said he often hears executives say that if they just had the culture of a Google or Netflix, their companies could really thrive.

George Westerman, principal research scientist, MIT Initiative on the Digital Economy

“And I say, ‘Are you sure you want that?’ That means you’ve got to hire people that way, pay them that way and you might need to move out to California. And frankly a lot of these cultures are not the happiest places to work,” Westerman said. And some can even be downright toxic, he noted, alluding to Uber’s problems with workplace culture.

The question for pre-digital companies, then, is not whether they can adopt a digital culture, but how to create the right digital culture, given their pre-digital legacies, which include how their employees want to work and how they want to treat employees. The next challenge will be infusing the chosen digital culture into every level of the organization.

Corporate values are important, but culture is what happens when the boss leaves the room, Westerman said, referencing his favorite definition.

David Gledhill, group CIO and head of group technology and operations, DBS Bank

“The practices are what matters,” he told the audience of CIOs, introducing a panel of experts who served up some practical advice.

Here are some of the digital culture lessons practiced by the two IT practitioners on the panel, David Gledhill, group CIO and head of group technology and operations at financial services giant DBS Bank, and Andrei Oprisan, vice president of technology and director of the Boston tech hub at Liberty Mutual Insurance, the diversified global insurer.

Liberty Mutual’s Andrei Oprisan: ‘Challenging everything’

Mission: Oprisan, who was hired by Liberty Mutual in 2017 to fix core IT systems and help unlock the business value in digital systems, said the company’s digital mission is clear and clearly understood. “We ask ourselves, ‘Are we doing the best thing for the customer in every single step we’re taking?'”

Andrei Oprisan, vice president of technology and director of the Boston tech hub, Liberty Mutual Insurance

The mission is also urgent, because not only are insurance competitors changing rapidly, he said, but “we’re seeing companies like Amazon and Google entering the insurance space.”

“We need to be able to compete with them and beat them at that game, because we do have those core competencies, we do have a lot of expertise in this area and we can build products much faster than they can,” he said.

Outside talent: Indeed, in the year since he was hired, Oprisan has scaled the Boston tech hub’s team from eight developers to over 120 developers, scrum masters and software development managers to create what he calls a “customer-centric agile transformation.” About a quarter of the hires were from inside the organization; the rest were from the outside.

Hiring from the outside was a key element in creating a digital culture in his organization, Oprisan said.

“We infused the organization with a lot of new talent to help us figure out what good looks like,” he said. “So, we’re not only trying to reinvent ourselves, investing in our own talent, helping them improve and giving them all the tools they need, but we’re also adding talent to that pool to change the way we’re solving all of these challenges.”

Small empowered teams: In the quest to get closer to the customer, the organization has become “more open to much smaller teams owning business decisions end to end,” he said, adding that empowering small teams represented a “seismic shift for any organization.” Being open to feedback and being “OK with failure” — the sine qua non of the digital transformation — is also a “very big part of being able to evolve very quickly,” he said.

“We’re challenging everything. We’re looking at all of our systems and all of our processes, we’re looking at culture, looking at brands, looking at how we’re attracting and retaining talent,” he said.

T-shirts and flip-flops: Oprisan said that autonomy and trust are key values in the digital culture he is helping to build at Liberty’s Boston tech hub.

“We emphasize that we are going to give them very challenging, hard problems to solve, and that we are going to trust they know how to solve them,” he said. “We’re going to hire the right talent, we’re going to give you a very direct mission and we’re going to get out of the way.”

In fact, Oprisan’s development teams work across the street from the company’s Boston headquarters, and they favor T-shirts and flip-flops over the industry’s penchant for business attire, he said — with corporate’s blessing. “Whatever it takes to get the job done.”

DBS Bank CIO David Gledhill: ‘Becoming the D in Gandalf’

Mission: Gledhill, the winner of the 2017 MIT Sloan CIO Leadership Award and a key player in DBS Bank’s digital transformation, said the digital journey at Singapore’s largest bank began a few years ago with the question of what it would take to run the bank “more like a technology company.”

Bank leadership studied how Google, Amazon, Netflix, Apple, LinkedIn and Facebook operated “at a technology level but also at a culture level,” he said, analyzing the shifts DBS would have to make to become more like those companies. In the process, Gledhill hit upon a slogan: DBS would strive to become the “D” in Google-Amazon-Netflix-Apple-LinkedIn-Facebook, turning GANALF into GANDALF. “It seems a little cheesy … but it just resonated so well with people.”

Cheesiness aside, the wizardry involved in becoming the “D” in Gandalf has indeed played out on a technology and human level, according to Gledhill. Employees now have “completely different sets of aspirations” about their jobs, a change that started with the people in the technology units and spread to operations and the real estate unit. “It was really revolutionary. Just unlocking this interest in talent and desire in people has taken us to a completely new level of operation.”

Gledhill is a fan of inspirational motifs — another DBS slogan is “Making banking joyful” — but he said slogans are not sufficient to drive digital transformation. He explained that the collective embrace of a digital culture by DBS tech employees was buttressed by five key operational tenets. (He likened the schema to a DBS version of the Trivial Pursuit cheese wheel.) They are: 1. Shift from project to platform; 2. Agile at scale; 3. Rethinking the organization; 4. Smaller systems for experimentation; 5. Automation.

Platform not projects, Agile: “Rather than having discrete projects that need budget and financing and committees and all that stuff, we got rid of all that,” Gledhill said. In its place, DBS has created and funded platforms with specific capabilities. Management describes the outcomes for teams working on the platforms. For example, goals include increasing the number of customers acquired digitally, or increasing digital transactions. But it does not prescribe the inputs, setting teams free to achieve the goals. That’s when “you can really start performing Agile at scale,” he said.

Rethink, rebuild, automate: DBS’s adoption of a digital culture required rethinking organizational processes and incentives. “We call it ‘organized for success’ on the cheese wheel, which is really about DevOps, business and tech together, and how you change the structure of the KPIs and other things you use to measure performance with,” he said.

On the engineering side, DBS now “builds for modern systems,” he said. That translates into smaller systems built for experimentation, for A/B testing, for data and for scaling. “The last piece was automation — how do you automate the whole tech pipeline, from test to build to code deploy,” Gledhill said.

“So those five cheeses were the things we wanted everybody to shift to — and that included open source and other bits and pieces,” he said. “On the outer rim of the five cheeses, each one had a set of maybe five to 10 discrete outputs that had to change.”

One objective of automating every system was to enable DBS to get products to market faster, Gledhill said. “We have increased our release cadence — that is, the number of times we can push into a dev or production environment — by 7.5 times. That’s a massive increase from where we started.”

Editor’s note: Look for detailed advice on how to create a digital culture from experts at McKinsey & Company and Korn Ferry in part two of this story later this week.

Why your KPI methodology should use ‘right brain’ words

CAMBRIDGE, Mass. — Despite the appeal of trendy technologies like artificial intelligence, one consultant is encouraging CIOs to go back to business intelligence basics and rethink their key performance indicator methodology.

Mico Yuk, CEO and co-founder of the consultancy BI Brainz Group in Atlanta, said companies are still struggling to make key performance indicators actionable — and not for lack of trying. It turns out the real stumbling block isn’t data, it’s language.

“KPIs are a challenge because of people, not because of measurements. A lot of problems that exist with KPIs are in the way that people interpret them,” she said in an interview at the recent Real Business Intelligence Conference.

Yuk sat down with SearchCIO and talked about how her key performance indicator (KPI) methodology, known as WHW, breaks KPIs into simple components and why her research drove her to consider the psychological impact of KPI names. This Q&A has been edited for brevity and clarity.

You recommend teams should have at least three but no more than five KPIs. What’s the science behind that advice?

Mico Yuk, CEO and co-founder of BI Brainz

Mico Yuk: There’s a book from FranklinCovey called The 4 Disciplines of Execution. It’s a fantastic book. In it, he talks about not having more than three WIGs — wildly important goals — per team. He did a study and proved that over a long period of time — a year, three months, six months, the typical KPI timeframe for tracking, monitoring and controlling — human beings can only take action and be effective on three goals. Other research firms have said five to eight KPIs are important. Today, I tell people that most report tracking of KPIs is done on mobile devices. It’s been proven that human beings get over 30,000 ads per day, and half of those exist on their phones. You are constantly competing for people’s attention. With shorter attention spans, you have to be precise, you have to be exact, and when you have your user’s attention, you have to make sure they have exactly what they need to take action or you’ll lose them.

The KPI methodology you ascribe to is called WHW. What is WHW?

Yuk: WHW stands for What, How and When. We took Peter Drucker’s SMART concept. Everybody knows him. He’s the ‘If you can’t see it, then you can’t measure it’ guy. His methodology is called SMART, which stands for specific, measurable, accurate, results-oriented and time-oriented. He says you have to have all five elements in your KPI in order for it to be useful. We said we’re going to look at what Drucker was recommending, extract those elements and turn them into a sentence structure. To do this you take any KPI and ask yourself three questions: What do you need to do with it? That’s the W. By how much? That’s the H. By when? That’s the W. You use those answers to rename your KPI so that it reads like this: the action, the KPI name, the how much, and the when. That is SMART hacked.
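
The renaming template is mechanical enough to capture in a few lines of code. Below is a sketch in Python of the WHW sentence structure as Yuk describes it (the function and argument names are illustrative, not BI Brainz’s):

```python
def whw_name(action: str, kpi: str, how_much: str, when: str) -> str:
    # WHW template: the action, the KPI name, the how much, the when.
    return f"{action} {kpi} by {how_much} by {when}"

# The profit-margin example Yuk gives later in the interview:
print(whw_name("Increase", "profit margin", "25%", "2017"))
# -> Increase profit margin by 25% by 2017
```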

Why do you find WHW to be a better KPI methodology?

Yuk: It’s easier. We don’t think one KPI methodology is necessarily better than the other. Using OKRs [Objectives and Key Results] is equally effective, for instance. But we do find that having just a sentence where someone can plug in words is much faster. Imagine going to a company and saying, ‘You have 20 KPIs. We’re going to transform all of them.’ Some of the methodologies require quite a bit of work to get that done. We find that when we show companies a sentence structure and they are able to just answer, answer, answer and see the transformation, it’s an ‘ah-ha’ moment for them. Not to mention there’s the consumption part of it. Now that you’re specific, it also makes it easier to break that big goal down into smaller goals for people to consume.

You’ve said it’s important to rename KPIs, but the language you use is equally as important. What led you to that conclusion?

Yuk: We are data visualization specialists, but when we started nine years ago we found that [our visualizations] were becoming garbage in, garbage out. We kept saying, ‘This looks great, but it’s not effective. Why?’ We then [looked at] what we were visualizing, and we realized that the KPIs we were visualizing were the problem — not the type of charts, not the color, not the prettiness. That led us to say, ‘We’ve got to look at these KPIs closely and figure out how to make these KPIs smarter.’ That was our shared challenge. That led us into learning more about ‘right-brain wording,’ learning about simplicity, learning about exactly what the perfect KPI looks like after we evaluated as many methodologies as we could find on the market. What we concluded is that it all starts with your KPI name.

What is “right-brain wording”?

Yuk: If you go online and you look up right brain versus left brain [wording], there are amazing diagrams. They show that your right brain controls your creativity while your left brain is more analytical. Most developers use the left side of their brains — analytics, coding, all that complex stuff. The artists of the world, the creatives who may not be able to understand complex math, they use the right part of their brain. But what you find on the creative side is that there is a cortex activity that happens when you use certain words that [are] visceral. We found that it is one thing to rename your KPIs, but it is another thing to get [the wording right] so that it resonates with people.

Let’s take profit margin as an example, and let’s say that after you use our WHW hack, the revised KPI name is ‘increase profit margin by 25% by 2017.’ If I were to ask you to visualize the word increase, you would probably see an arrow that points up. OK, it’s a visual but not one that you have an emotional connection to — it’s a left-brain, analytical word. But if I ask you to visualize a right-brain word like grow, I guarantee you’ll see a green leaf or plant in your brain. What happens in your brain is, because you’re thinking of a leaf, there’s an association that happens. Most people have a personal experience with the word grow — a memory of some kind. But they don’t have the same relationship with the word increase. Because of the association, users are more likely to remember and take action on that KPI. User adoption of KPIs and taking action is a problem. If you take the time to wordsmith or rename the KPIs so that they’re more right-brain-oriented, you can transform how your users act and react to them.

How can CIOs help make employees feel connected to top-line goals with KPIs?

Yuk: After we finish wordsmithing the KPI’s name, we focus on impact. A CIO in New York told me a long time ago, ‘One of the most important things you need to remember is that everybody has to be tuned into WIFM.’ And I asked, ‘What’s that?’ He said, ‘What’s in it for me?’

The good thing about transforming a KPI into the WHW format — it now has the action, the KPI name, the how much, the by when all in the name. You are now able to take that 25% [profit margin goal] and set deadlines and break it down, not just for the year, but by month, by quarter and even by day. You can break it down to the revenue streams that contribute [to the goal] and see what percentage those revenue streams contribute. That’s where you can individualize expectations and actions.

You tend to find two things. Not only can you individualize expectations, but you can also say, now that you have that individual goal, I can show you how it ties back into the overall goal and how other people are performing compared to you. People innately want to be competitive. They want to be on top — the leaderboard syndrome.

Those two elements are keys to having impact with your KPIs. Again, it’s a bit more psychological, but KPIs aren’t working. So we dug deep into the more cognitive side to try to figure out how to make them resonate with people and the [psychological] rabbit hole goes very deep. Start with the name.