Abstract |
---|
We present In-network Optical Inference (IOI), a system providing low-latency machine learning inference by leveraging programmable switches and optical matrix multiplication. IOI consists of a novel transceiver module designed specifically to perform linear operations such as matrix multiplication in the optical domain. IOI's transceivers are plugged into programmable switches to perform non-linear activation and respond to inference queries. We demonstrate how to process inference queries inside the network, without sending them to cloud or edge inference servers, thus significantly reducing the end-to-end inference latency experienced by users. We believe IOI is the next frontier for exploring real-time machine learning systems and opens up exciting new opportunities for low-latency in-network inference. |
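The abstract describes inference split across two stages: linear operations (matrix multiplication) performed in the optical domain by the transceiver, and non-linear activation performed by the programmable switch. A minimal sketch of that dataflow, with the weight matrix, input vector, and ReLU activation all chosen purely for illustration (the paper does not specify these):

```python
import numpy as np

# Illustrative sketch, not the paper's implementation: an inference query is
# answered by a linear step (modeling the optical matrix multiplication in the
# transceiver) followed by a non-linear activation (modeling the programmable
# switch). W, x, and the ReLU choice are assumptions for illustration.

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # layer weights, applied in the optical domain
x = rng.standard_normal(8)        # incoming inference query (feature vector)

y_linear = W @ x                  # transceiver: optical matrix-vector product
y = np.maximum(y_linear, 0.0)     # switch: non-linear activation (ReLU here)

print(y.shape)  # (4,)
```

The point of the split is that the matrix multiplication, which dominates the arithmetic cost, never leaves the optical path, while the switch only applies the cheap element-wise non-linearity.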
Year | DOI | Venue |
---|---|---|
2021 | 10.1145/3473938.3474508 | SIGCOMM |
Keywords | DocType | Citations |
---|---|---|
In-network computing, Edge computing, Optical neural networks, Programmable switches, Machine learning inference | Conference | 0 |
PageRank | References | Authors |
---|---|---|
0.34 | 0 | 7 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zhizhen Zhong | 1 | 0 | 0.68 |
Weiyang Wang | 2 | 0 | 0.34 |
Monia Ghobadi | 3 | 433 | 27.10 |
Alexander Sludds | 4 | 0 | 1.35 |
Ryan Hamerly | 5 | 2 | 1.06 |
Liane Bernstein | 6 | 1 | 1.37 |
Dirk Englund | 7 | 0 | 0.34 |