Benchmarking Machine Learning Inference Over Streaming Data
||NAISS Small Compute|
||Sonia Florina Horchidan <email@example.com>|
||Kungliga Tekniska högskolan|
||2023-03-01 – 2024-03-01|
This project aims to compare existing methods for deploying pre-trained models over streaming data. We will investigate stream processors such as Apache Flink, Kafka Streams, and Spark Streaming, combined with the model serving tools ND4J, ONNX, TensorFlow SavedModel, TorchServe, and TensorFlow Serving. The investigation will serve as a guide for developers who need to integrate model serving into their streaming pipelines.
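To make the embedded serving pattern concrete, here is a minimal, hypothetical sketch of per-event inference inside a stream operator, the style that tools like ND4J or ONNX enable in-process (as opposed to calling an external server like TorchServe or TensorFlow Serving). The event source, the `embedded_model` stub, and its weights are illustrative stand-ins, not part of the actual benchmark; a real pipeline would load a pre-trained model instead.

```python
import time
from typing import Callable, Iterator

def embedded_model(features: list[float]) -> float:
    # Stub standing in for a pre-trained model loaded in-process
    # (e.g. via ONNX Runtime); a fixed linear scorer keeps the
    # example self-contained and deterministic.
    weights = [0.5, -0.25, 1.0]
    return sum(w * x for w, x in zip(weights, features))

def stream_inference(events: Iterator[list[float]],
                     model: Callable[[list[float]], float]
                     ) -> list[tuple[float, float]]:
    # Score each event as it arrives and record per-event latency,
    # the kind of measurement a serving benchmark collects.
    results = []
    for features in events:
        start = time.perf_counter()
        score = model(features)
        latency = time.perf_counter() - start
        results.append((score, latency))
    return results

if __name__ == "__main__":
    events = iter([[1.0, 2.0, 3.0], [0.0, 1.0, 0.5]])
    for score, latency in stream_inference(events, embedded_model):
        print(f"score={score:.2f} latency={latency * 1e6:.1f}us")
```

An external-serving variant would replace the in-process call with a network request per event (or per micro-batch), which is exactly the latency/throughput trade-off the benchmark is meant to quantify.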
This project extends a previously published research paper: https://dl.acm.org/doi/abs/10.1145/3533028.3533308