ONNX Runtime Web
Run ONNX models in the browser with WebAssembly and WebGL backends
License
MIT
Released
2019
Stars
15k
Use Cases
Run PyTorch/TensorFlow models (exported to ONNX) in the browser, cross-platform ML inference, edge AI
Code Example
import * as ort from 'onnxruntime-web';
const session = await ort.InferenceSession.create('model.onnx');
// The feed name ('input') and tensor shape must match the model's declared input
const tensor = new ort.Tensor('float32', new Float32Array(1 * 3 * 224 * 224), [1, 3, 224, 224]);
const results = await session.run({ input: tensor });
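By default the WebAssembly backend is used; the WebGL backend can be requested through the session options, and the listed execution providers are tried in order. A minimal sketch, assuming a model file named model.onnx and falling back to WebAssembly when WebGL is unavailable:
import * as ort from 'onnxruntime-web';
// Prefer the WebGL execution provider, falling back to WebAssembly if it is unavailable
const session = await ort.InferenceSession.create('model.onnx', {
  executionProviders: ['webgl', 'wasm'],
});
// Inspect the model's declared input/output names before building the feeds object
console.log(session.inputNames, session.outputNames);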