
Chapter 5

Conclusion & References

Machine learning represents a valuable asset for companies, but its full potential can only be reached by following the principles of MLOps, which optimize the process of designing, developing, and deploying models.

MLOps promotes the use of ML pipelines to define the workflow that brings a model to life. Pipelines are composed of modular components, each isolating a specific task, and are compiled into a static template. This encourages reusability and Agile development. Moreover, automating these tasks significantly reduces the risk of error compared to deploying new models manually.
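As a rough illustration of the component-and-template idea, the sketch below uses the Kubeflow Pipelines SDK (kfp v2), which Vertex AI Pipelines accepts; the component bodies, pipeline name, and output path are placeholders chosen for this example, not details taken from the text.

```python
from kfp import dsl, compiler


@dsl.component
def preprocess(raw_path: str) -> str:
    # Placeholder preprocessing step: each component isolates one task.
    return raw_path + "/cleaned"


@dsl.component
def train(data_path: str) -> str:
    # Placeholder training step that consumes the preprocessing output.
    return "model-trained-on-" + data_path


@dsl.pipeline(name="example-training-pipeline")
def training_pipeline(raw_path: str):
    # The workflow is declared by wiring component outputs to inputs.
    prep_task = preprocess(raw_path=raw_path)
    train(data_path=prep_task.output)


# Compilation produces a static template file that can be versioned,
# reused, and submitted to a pipeline runner such as Vertex AI Pipelines.
compiler.Compiler().compile(training_pipeline, "training_pipeline.json")
```

Because the compiled template is just a file, the same pipeline definition can be re-run with different parameters or promoted across environments without changing code, which is where the reusability benefit comes from.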

GCP provides an optimal environment for developing machine learning solutions. Google Cloud Storage allows the user to organize and store data in buckets while guaranteeing availability and granular access control. BigQuery can benefit any ML project by providing a solution for data warehousing, data exploration, and feature engineering, as well as native integration with visualization tools. Vertex AI offers a set of ML products to suit all needs, from ready-to-use APIs to the deployment of fully customized models. An ML engineer can then choose the best solution depending on the scope of the project and the desired outcome.
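As a minimal sketch of how these services are typically accessed from code, the example below uses the official google-cloud-storage and google-cloud-bigquery client libraries; the project, bucket, dataset, and table names are hypothetical and stand in for whatever a given project actually uses.

```python
from google.cloud import bigquery, storage

# Hypothetical identifiers for illustration only.
PROJECT = "my-ml-project"
BUCKET = "my-ml-project-data"

# Cloud Storage: list the objects staged under a training-data prefix.
storage_client = storage.Client(project=PROJECT)
for blob in storage_client.list_blobs(BUCKET, prefix="training/"):
    print(blob.name)

# BigQuery: run a simple feature-engineering query and iterate the rows.
bq_client = bigquery.Client(project=PROJECT)
query = """
    SELECT user_id, COUNT(*) AS purchase_count
    FROM `my-ml-project.sales.transactions`
    GROUP BY user_id
"""
for row in bq_client.query(query).result():
    print(row["user_id"], row["purchase_count"])
```

The resulting features could then be exported to Cloud Storage or consumed directly by a Vertex AI training job, depending on which of the services described above fits the project.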

Applying MLOps principles to ML projects is essential for optimal delivery. Moreover, GCP provides ML engineers with services that accelerate model development and ensure that the deployed products are effective.

References 

  • Google Cloud. End to end MLOps. URL: https://cloud.google.com/architecture/mlops-continuous-delivery-and-automation-pipelines-in-machine-learning (accessed: 12.01.2023).
  • Google Cloud. Resource hierarchy. URL: https://cloud.google.com/resource-manager/docs/cloud-platform-resource-hierarchy (accessed: 10.01.2023).