Offering LLM Services over Edge Computing
The prevailing model for LLM services delivers inference to all users from centralized cloud providers. However, this approach is expected to face scalability problems as demand grows. In this project, we intend to evaluate the performance of alternatives that offer LLM services over edge computing.
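A performance evaluation like this typically starts by benchmarking inference latency under each deployment option. The sketch below is a minimal, hypothetical harness: `measure_latency` times repeated calls to a generation function and reports mean and p95 latency. The `cloud_generate` and `edge_generate` functions are stand-ins (their names and delays are assumptions, not real services); in an actual experiment they would wrap calls to a cloud API and an edge-hosted model.

```python
import time
from statistics import mean, quantiles

def measure_latency(generate, prompt, runs=20):
    """Time repeated calls to a text-generation function and
    report mean and p95 latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        generate(prompt)
        samples.append((time.perf_counter() - start) * 1000)
    p95 = quantiles(samples, n=20)[-1]  # 19th of 20 cut points = 95th percentile
    return {"mean_ms": mean(samples), "p95_ms": p95}

# Stand-ins for the two deployments under test (hypothetical delays):
def cloud_generate(prompt):
    time.sleep(0.005)  # placeholder for network round-trip + inference

def edge_generate(prompt):
    time.sleep(0.002)  # placeholder for nearby edge inference

if __name__ == "__main__":
    for name, fn in [("cloud", cloud_generate), ("edge", edge_generate)]:
        stats = measure_latency(fn, "Hello", runs=10)
        print(f"{name}: mean={stats['mean_ms']:.1f} ms  p95={stats['p95_ms']:.1f} ms")
```

Beyond single-client latency, a fuller study would also vary concurrent load to expose the scalability limits mentioned above.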