Wednesday, December 5, 2018

Microsoft open sources high-performance inference engine for machine learning models

Microsoft yesterday announced that it is open sourcing ONNX Runtime, a high-performance inference engine for machine learning models in the ONNX format on Linux, Windows, and Mac. ONNX Runtime lets developers train and tune models in any supported framework and then productionize those models with high performance, both in the cloud and at the edge. Microsoft is using ONNX Runtime […]
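To illustrate the workflow the announcement describes, here is a minimal sketch of loading an exported ONNX model and running inference with the onnxruntime Python package; the model file name, input shape, and dummy data are illustrative assumptions, not details from the announcement.

    # Minimal sketch: run an ONNX model with ONNX Runtime (Python API).
    # "model.onnx" and the input shape below are placeholders.
    import numpy as np
    import onnxruntime as ort

    # Load a model exported to the ONNX format from any supported framework.
    session = ort.InferenceSession("model.onnx")

    # Inspect the model's expected input name and shape.
    input_meta = session.get_inputs()[0]
    print(input_meta.name, input_meta.shape)

    # Run inference on a dummy batch; replace with real preprocessed data.
    dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
    outputs = session.run(None, {input_meta.name: dummy_input})
    print(outputs[0].shape)

The same ONNX model file can be served this way on Linux, Windows, or Mac, which is the cross-platform portability the post highlights.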




from MSPoweruser https://ift.tt/2BQP5To
via IFTTT
