New mobile machine learning capabilities are coming to Apple devices, thanks to IBM Watson and Apple Core ML.
Under CEO Tim Cook’s leadership, Apple has been angling for an ever-bigger slice of the enterprise pie. Last year we saw the technology giant partner with GE to bring the industrial predictive analytics capabilities of the Predix IIoT platform to Apple’s iOS.
But for the past few years Apple has been deepening and extending its ties with IBM too. When the two companies announced their strategic partnership in 2014, Tim Cook and IBM CEO Virginia Rometty claimed that “Apple and IBM are like puzzle pieces that fit perfectly together.”
The agreement saw IBM bring more than 150 of its enterprise IT apps and tools natively to Apple platforms, and sell iPhones and iPads to its business clients worldwide. Crucially, it gave Apple access to the business verticals that Microsoft has historically dominated.
Now the duo have added machine learning to the partnership, combining IBM Watson with Apple Core ML to bring new AI insights to the business apps on Apple devices.
Watson + mobile machine learning
Earlier this week we reported on IBM’s new Watson Assistant AI, and this latest announcement with Apple extends Watson’s reach further. The initiative will benefit the hundreds of products that the collaboration has produced so far, across finance, insurance, energy, manufacturing, aviation, and beyond.
In leveraging the new technology, customers can build machine learning models using IBM Watson (the company’s cloud-based AI platform for business) and train them with their own industry-specific data. They can create multiple machine learning models, compare the results, and run automated experiments – identifying patterns and gaining insights to reach decisions more quickly.
Machine learning is implemented with IBM Watson’s visual modelling tools, such as PixieDust and Brunel, and there is also support for Jupyter notebooks in Python, R, and Scala – plus the open-source RStudio. The trained model is then converted to Apple’s Core ML format so that it can be integrated into Apple-compatible applications.
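To make the train-in-the-cloud, run-on-the-device flow concrete, here is a minimal, self-contained sketch in Python. It is purely illustrative: the trivial nearest-centroid classifier stands in for a model trained in Watson Studio, and the JSON export stands in for the Core ML conversion step (which in practice would use Apple’s coremltools).

```python
import json
import math

# --- "Cloud" side: train a trivial nearest-centroid classifier ---
# Stand-in for a Watson-trained model.
def train(samples):
    """samples: list of (feature_vector, label) pairs."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, x in enumerate(vec):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def export_model(centroids):
    """Serialise the model - a stand-in for exporting a .mlmodel file."""
    return json.dumps(centroids)

# --- "Device" side: load the exported model and classify offline ---
def classify(model_json, vec):
    centroids = json.loads(model_json)
    return min(centroids,
               key=lambda label: math.dist(centroids[label], vec))

training_data = [
    ([0.0, 0.1], "defect"), ([0.1, 0.0], "defect"),
    ([0.9, 1.0], "ok"),     ([1.0, 0.9], "ok"),
]
artefact = export_model(train(training_data))
print(classify(artefact, [0.05, 0.05]))  # prints "defect"
```

The key design point mirrored here is the separation of concerns: training happens server-side on industry data, and only the exported artefact ships to the device, where inference runs without a network connection.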
One such application of machine learning enables iPhone cameras to access Watson’s image recognition capabilities. Users can identify and classify content, before analysing it to extract detailed information. This capability could shake up workflows in the industrial, logistics, and healthcare sectors.
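An app consuming such a service typically receives a nested classification response and filters it down to confident labels before extracting further detail. The sketch below assumes a hypothetical Watson-style response shape; the exact field names are an assumption for illustration, not the documented API.

```python
import json

# Hypothetical Watson-style image classification response; the field
# names here are an assumption for illustration only.
response = json.loads("""
{
  "images": [{
    "classifiers": [{
      "classes": [
        {"class": "hairline crack", "score": 0.93},
        {"class": "corrosion",      "score": 0.41}
      ]
    }]
  }]
}
""")

def top_classes(resp, threshold=0.5):
    """Flatten the nested response, keeping confident labels only."""
    return [(c["class"], c["score"])
            for image in resp["images"]
            for clf in image["classifiers"]
            for c in clf["classes"]
            if c["score"] >= threshold]

print(top_classes(response))  # [('hairline crack', 0.93)]
```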
The machine learning model will mature over time as apps feed data back to Watson in the cloud. Mahmoud Naghshineh, general manager for IBM Partnerships and Alliances, explained via TechCrunch:
That’s the beauty of this combination. As you run the application, it’s real time and you don’t need to be connected to Watson, but as you classify different parts [on the device], that data gets collected and when you’re connected to Watson on a lower [bandwidth] interaction basis, you can feed it back to train your machine learning model and make it even better.
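The pattern Naghshineh describes – classify offline, then upload the collected data in one low-bandwidth interaction – can be sketched with a simple buffer. This is an illustrative sketch only; Watson’s actual upload API is not shown, and `upload` is a hypothetical callable standing in for it.

```python
from collections import deque

class FeedbackBuffer:
    """Collect on-device classifications while offline, then flush
    them to the cloud in one batch when connectivity returns."""

    def __init__(self):
        self._pending = deque()

    def record(self, sample, label):
        """Queue one on-device classification result."""
        self._pending.append((sample, label))

    def flush(self, upload):
        """upload: callable taking a list of (sample, label) pairs.
        Returns the number of items sent."""
        batch = list(self._pending)
        if batch:
            upload(batch)          # one low-bandwidth interaction
            self._pending.clear()
        return len(batch)

buf = FeedbackBuffer()
buf.record("part-042.jpg", "worn gasket")   # classified offline
buf.record("part-043.jpg", "ok")

sent = []
print(buf.flush(sent.extend))  # prints 2: both items sent in one batch
```

Batching the feedback this way is what lets the device keep classifying in real time while still contributing training data back to the cloud model whenever a connection is available.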
Internet of Business says
By placing these sorts of automated, intelligent applications in the hands of enterprise workers, via their iPhones or iPads, IBM and Apple are enabling a more informed and mobile workforce. This has the potential to boost efficiency levels, collaboration, and decision-making.
While the two companies may seem like unlikely bedfellows, there was truth in the ‘puzzle piece’ analogy. By merging the design and UX pedigree of Apple’s decades of consumer experience with IBM’s IT expertise and vertical user base, the pair have the makings of a force to breach fortress Microsoft in the business arena.