What to Read Next
Hearables are here: Apple held its “Special Event” and, among other things, officially killed the iPhone’s 3.5-millimeter headphone jack, replacing it with $159 wireless AirPods. My first reaction: Meh. But then I read Mike Elgan’s paean to this development in Computerworld.
Elgan says that AirPods are actually artificial intelligence hardware. “The biggest thing going on here is the end of ‘dumb speaker’ earbuds, and the mainstreaming of hearables — actual computers that go in your ears,” he says. “Bigger still is that the interface for these tiny computers is a virtual assistant. When you double-tap on an AirPod, Siri wakes up, enabling you to control music play and get battery information with voice commands.”
What does this mean for your company? Soon every employee could have a supercomputer whispering in his or her ear. For instance, hearables startup Bragi and IBM just announced that they plan to combine Bragi’s Dash earbuds and IBM’s Watson IoT platform “to transform the way people interact, communicate, and collaborate in the workplace.”
Earbud-sporting workers, according to the companies, will use the devices to “receive instructions, interact with co-workers, and enable management teams to keep track of the location, operating environment, well-being, and safety of workers.” Bragi and IBM have targeted six areas of initial focus: worker safety, guided instructions, smart employee notifications, team communications, workforce analysis and optimization, and biometric ID.
Cybersecurity is every executive’s responsibility: The fact that Hillary Clinton’s email foibles have become a political football of Super Bowl proportions should be a wake-up call to business leaders: Their job descriptions include cybersecurity. “Defending against attacks is now a permanent part of senior executives’ job descriptions,” writes Bill Sweeney, CTO, Americas at BAE Systems, in Harvard Business Review. “It’s no longer enough to leave cybersecurity to annual reviews or a lone CISO. Senior executives must understand what procedures are in place and ensure that everyone in the organization understands protocol and takes accountability.”
With Kaspersky Lab reporting that the average cost of an individual cyberattack on a large company now exceeds $800,000, this may seem obvious. Yet Sweeney points out that in a recent BAE Systems survey of 300 managers in the financial services, insurance, and IT/tech industries in the United States, 40% of the respondents admitted lacking “a clear understanding of the cybersecurity protocols within their organizations.”
If that describes you, Sweeney offers some valuable, practical advice: Find out what your company’s protocols are and periodically assess the residual risk of cyberattack; engage with and support your company’s chief information security officer; and promote cybersecurity — by means of employee education and training and by adopting and modeling secure practices yourself.
Diving into data lakes: Back in 2010, James Dixon, the founder and CTO of Pentaho, introduced the idea of a “data lake” — a single repository into which you could collect unprocessed data from multiple, diverse sources and make it accessible in one place. The data lake has been and remains a key ingredient in making big data, well, big.
Unfortunately, as consultants Rash Gandhi, Sanjay Verma, Elias Baltassis, and Nic Gordon point out in a new Boston Consulting Group paper, data lakes come with a couple of challenges of their own. The first is the continuing lack of fully developed tools for securing data lakes and for ensuring data quality and validation. The second is a lack of data scientists who have data lake expertise.
The authors offer a four-step prescription for addressing these challenges and successfully developing and using data lakes. First, figure out the “highest-value use cases for big data” — whether that be insight generation, operational analytics, or transaction processing. Second, select the right operating model for implementation. Third, ensure data quality, security, and governance. Last, but certainly not least, “build an organization that is ready and able to make the best use of that data.” After all, there’s no point to building a data lake if no one’s going to swim in it.