Algorithmia Looks to Meld MLOps and DevOps

Source: devops

Algorithmia, in the latest update to the enterprise edition of its namesake machine learning operations (MLOps) platform, makes it possible to apply software development lifecycle practices to building algorithms by enabling them to be debugged with widely used desktop tools.

The enterprise edition of Algorithmia will enable users to write and run local tests for algorithms using shared local data files. Developer tools and platforms that can now integrate with that process include PyCharm, Jupyter Notebooks, R Shiny, Android, iOS, Cloudinary, DataRobot and H2O.ai.
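To make the idea concrete, here is a minimal sketch of what such a local test might look like, assuming a Python algorithm that exposes the conventional apply() entry point and a hypothetical local fixture file (sample_input.json); the module name, fixture path and expected output shape are illustrative assumptions, not Algorithmia's documented API.

```python
# test_sentiment_algorithm.py
# Hypothetical local test for an Algorithmia-style Python algorithm.
# Assumes the algorithm module exposes an apply(input) function and that
# fixtures/sample_input.json is a shared local data file in the repo.
import json
from pathlib import Path

from src.sentiment_algorithm import apply  # hypothetical module name


def load_fixture(name: str):
    """Read a shared local data file used as test input."""
    fixture_path = Path(__file__).parent / "fixtures" / name
    return json.loads(fixture_path.read_text())


def test_apply_returns_expected_shape():
    payload = load_fixture("sample_input.json")
    result = apply(payload)
    # The algorithm is assumed to return a dict with a score between 0 and 1.
    assert isinstance(result, dict)
    assert 0.0 <= result["score"] <= 1.0
```

Because the test exercises a plain Python function against local files, it can be run from PyCharm, a Jupyter notebook or any other desktop environment before the algorithm is published to the platform.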

In addition, Algorithmia MLOps has extended its support for the latest graphics processing units (GPUs) available on Amazon Web Services (AWS) and Microsoft Azure.

Finally, the company has added support for AWS C2S, a private cloud for intelligence services, and AWS GovCloud.

Algorithmia CEO Diego Oppenheimer said IT teams building artificial intelligence (AI) applications are applying many of the same DevOps best practices employed elsewhere in the organization. Making it possible to debug those applications on a local desktop makes it easier for organizations to construct those DevOps workflows, he said.

The Algorithmia platform enables IT teams to control the provenance of all components of ML operations, including certificate authorities, operating systems, container images, code, dependencies and ML models employed.
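As an illustration of what tracking that kind of provenance can involve, the sketch below records the hash of a model artifact and the versions of its dependencies in a simple manifest; the file names and manifest format are assumptions for the example, not part of Algorithmia's platform.

```python
# build_manifest.py
# Hypothetical provenance manifest: pins the exact model artifact and
# dependency versions used for a deployment so they can be audited later.
import hashlib
import json
from importlib import metadata
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(model_path: Path, packages: list[str]) -> dict:
    """Capture the model hash and installed package versions."""
    return {
        "model_file": model_path.name,
        "model_sha256": sha256_of(model_path),
        "dependencies": {name: metadata.version(name) for name in packages},
    }


if __name__ == "__main__":
    manifest = build_manifest(Path("model.pkl"), ["numpy", "scikit-learn"])
    Path("provenance.json").write_text(json.dumps(manifest, indent=2))
```

A record like this, kept alongside container image digests and code commits, is one way teams can reconstruct exactly which components produced a given deployment.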

Many organizations today are struggling with how to meld MLOps and DevOps processes. The rate at which AI models are created and updated is typically much slower than the rate at which the applications those models are infused into are updated. Data scientists are now coming to terms with how to deploy and update AI models within the context of a larger set of DevOps processes.

In the wake of the COVID-19 pandemic, interest in AI is rising. Organizations are looking for ways to leverage machine learning algorithms to both drive down costs and increase revenue by automating a wide range of tasks. A recent global survey of 2,737 information technology and line-of-business executives conducted by Deloitte finds 61% of respondents said AI will substantially transform their industry in the next three years. More than half (53%) also noted they have spent more than $20 million during the past year on AI technologies and the talent required to master them.

However, the same survey noted over half the respondents (56%) said their organization is slowing adoption of AI technologies because of the emerging risks. Obviously, improving the debugging process for applications that incorporate machine learning algorithms could go a long way toward reducing those risks.

It’s not clear to what degree AI might one day transform the application experience. It is safe to say that eventually, almost every application will incorporate machine and deep learning algorithms to automate one or more processes. Those algorithms won’t eliminate the need for humans as much as they will automate rote processes that today consume more time than anyone prefers. The challenge, of course, will be making sure algorithms perform those tasks consistently as more data becomes available.
