There are frequent media headlines both about the scarcity of machine learning talent and about the promises of companies claiming that their products automate machine learning and eliminate the need for ML expertise altogether. In his keynote at the TensorFlow DevSummit, Google’s head of AI Jeff Dean estimated that there are tens of millions of organizations that have electronic data that could be used for machine learning but lack the necessary expertise and skills. I follow these issues closely, since my work at fast.ai focuses on enabling more people to use machine learning and on making it easier to use.
Jim Hannan, executive vice president and CEO of enterprises for Koch Industries, saw it similarly, with Koch’s deep pockets helping to propel Infor forward. “As a global organization spanning multiple industries across 60 countries, Koch has the resources, knowledge and relationships to help Infor continue to expand its transformative capabilities,” he said in a statement.
Before this latest wave of AI and machine learning interest and hype, organizations with data-centric project needs also looked for methodologies that suited their goals. Emerging from roots in data mining and data analytics, several of these methodologies had at their core an iterative cycle focused on data discovery, preparation, modeling, evaluation, and delivery. One of the earliest to be developed is known simply as Knowledge Discovery in Databases (KDD). However, much like waterfall methodologies, KDD is in some ways too rigid or too abstract to cope with continuously evolving models.
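As a rough illustration of that cycle, the sketch below runs one pass of discovery, preparation, modeling, evaluation, and delivery, and repeats it until an evaluation threshold is met. The step functions, the scikit-learn model choice, and the quality bar are all assumptions made for the example; KDD prescribes the phases, not any particular code.

```python
# Minimal sketch of an iterative KDD-style cycle (illustrative only).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def discover():
    # Discovery/selection: pick the data relevant to the question at hand.
    X, y = load_iris(return_X_y=True)
    return X, y

def prepare(X, y):
    # Preparation: split (and, in practice, clean, impute, and encode) the data.
    return train_test_split(X, y, test_size=0.3, random_state=0)

def model_and_evaluate(X_train, X_test, y_train, y_test):
    # Modeling and evaluation: fit a candidate model, score it on held-out data.
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return clf, clf.score(X_test, y_test)

def kdd_cycle(quality_bar=0.9, max_rounds=5):
    # Iterate the cycle until evaluation clears the quality bar;
    # "delivery" here is just a stand-in print statement.
    for _ in range(max_rounds):
        X, y = discover()
        splits = prepare(X, y)
        clf, score = model_and_evaluate(*splits)
        if score >= quality_bar:
            print(f"Delivering model with held-out accuracy {score:.2f}")
            return clf
    raise RuntimeError("No model met the quality bar; revisit earlier steps")

if __name__ == "__main__":
    kdd_cycle()
```

The point of the loop is the part rigid methodologies handle poorly: if evaluation fails, the work goes back to earlier phases rather than marching forward.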
As organizations become increasingly reliant on technology, legacy systems become a roadblock to adaptability and growth. Business operations are built on a combination of enterprise and cloud technologies that are scaled to meet immediate needs but are often poor at accommodating pivots. It’s difficult or impossible to predict future business needs, but investing in closed systems that yield only immediate returns is a dead end.
"It’s important that the person understands what driving performance and change in a large organisation entails, so there is a requirement for a certain level of operating experience," he said.
The Institute of Electrical and Electronics Engineers (IEEE) is one of the largest organizations for computer scientists in the world. With hundreds of thousands of members, the group undertakes initiatives to create common standards and often consults organizations like the European Commission and OECD on matters of ethics and design principles.
“We started off with smaller units within agencies who were focused on being able to deal with cybercrime and money laundering,” Levin said. “What we found is that the different types of crime and illicit activity that these agencies need to be able to prevent means that our appeal has become much broader to those organizations, and our role has expanded.”
“We are partnering with other organizations to collect and distribute this kind of data. Some people will want their data to be openly available while others are interested in monetizing it. We are working within those requirements.”
So it is with transformations. As we’ve noted before, the term “transformation” can be vague, and it too often refers only to minor or isolated initiatives. What should define a transformation is in fact the opposite: an intense, well-managed, organization-wide program to enhance performance and to boost organizational health. And the results should always be measured.
Why is a tech backbone so critical to operations—to say nothing of a tech-enabled transformation? In short, the tech backbone manages the storage, aggregation, analysis, and provision of data across the organization. Companies with obsolete legacy systems face significant challenges in consolidating data from different sources, maintaining consistency and quality, and making the necessary updates within the required timelines. And because solutions based on analytics, AI, and the industrial IoT all rely on unfettered access to comprehensive, high-quality data to generate business insights, an inadequate tech backbone will severely limit the impact of these technologies.
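To make that consolidation role concrete, here is a small, hypothetical sketch of what a backbone’s ingestion layer does: pull records from heterogeneous systems, normalize them to a single schema, and drop rows that fail basic quality checks before exposing the data to analytics. The source systems, field names, and schema below are invented for illustration and do not refer to any particular product.

```python
# Illustrative data-consolidation layer: two hypothetical sources, one schema.
from dataclasses import dataclass
from typing import Iterable, List, Optional

@dataclass
class CustomerRecord:
    customer_id: str
    revenue: float  # normalized to dollars

def from_crm(row: dict) -> Optional[CustomerRecord]:
    # The (hypothetical) CRM exports IDs under "id" and revenue in dollars.
    try:
        return CustomerRecord(str(row["id"]), float(row["annual_revenue"]))
    except (KeyError, ValueError):
        return None  # quality gate: drop rows that cannot be normalized

def from_billing(row: dict) -> Optional[CustomerRecord]:
    # The (hypothetical) billing system uses "cust" and cents instead of dollars.
    try:
        return CustomerRecord(str(row["cust"]), float(row["revenue_cents"]) / 100)
    except (KeyError, ValueError):
        return None

def consolidate(crm_rows: Iterable[dict], billing_rows: Iterable[dict]) -> List[CustomerRecord]:
    # Aggregate both sources into one consistent list for downstream analytics.
    merged = [from_crm(r) for r in crm_rows] + [from_billing(r) for r in billing_rows]
    return [rec for rec in merged if rec is not None]

if __name__ == "__main__":
    crm = [{"id": 7, "annual_revenue": "1200.50"}, {"id": 8}]  # second row is incomplete
    billing = [{"cust": "9", "revenue_cents": 250000}]
    print(consolidate(crm, billing))
```

In a real backbone this normalization and quality-gating happens at scale and continuously; the difficulty with obsolete legacy systems is precisely that this layer is missing or too brittle to keep current.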
In achieving modern delivery, the often disconnected elements - agile, DevOps, cloud-native architecture, testing, security, risk management, and compliance - are assembled into a unified model. This alignment and focus create speed, reduce risk, and enhance quality across the entire organization, enabling it to deliver value rapidly and continuously.