view love/README.md @ 143:276bb3555034
Should work now.
| author | June Park <parkjune1995@gmail.com> |
|---|---|
| date | Fri, 09 Jan 2026 13:13:27 -0800 |
| parents | cf9caa4abc3e |
| children | |
# AI (Love)

This is where I create an AI chat bot. The code is organized as below:

- Epi - FE repo
  - Vite and TanStack Router
  - Mostly using Tailwind
- Poppy - BE repo
  - Python with FastAPI

The code is pretty much self-explanatory.

## How to start

`docker-compose up` or `docker compose up`

I don't really use Redis anymore, but I can see it being an improvement if there is more than 2000 RPS; it will never get there, though, so it should be fine.

## TODO

- proper chain of thought?
- maybe expose the MCP server
- use a local LLM. I wonder if I can get free API keys that I can use with threshold values.
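The compose file itself isn't shown in this README; for the "How to start" step, a minimal sketch of a two-service setup like this might look as follows (the service names `epi` and `poppy`, the build paths, and the ports are all assumptions, not taken from the repo):

```yaml
services:
  epi:             # FE repo: Vite dev server (port is an assumption)
    build: ./epi
    ports:
      - "5173:5173"
  poppy:           # BE repo: FastAPI app (port is an assumption)
    build: ./poppy
    ports:
      - "8000:8000"
```

With a file like this in the repo root, either `docker-compose up` (v1 CLI) or `docker compose up` (v2 plugin) starts both services.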
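The "threshold values" idea from the TODO could be sketched as a simple per-key budget guard: stop using a free API key once its usage crosses a limit. This is a hypothetical illustration, not code from Poppy; the class name, limit, and token counts are made up.

```python
class KeyBudget:
    """Hypothetical guard: reject calls on a free API key once its
    token threshold would be exceeded (illustration only)."""

    def __init__(self, limit_tokens: int) -> None:
        self.limit = limit_tokens
        self.used = 0

    def allow(self, tokens: int) -> bool:
        """Record the usage and return True if the call fits the budget."""
        if self.used + tokens > self.limit:
            return False
        self.used += tokens
        return True


budget = KeyBudget(limit_tokens=100)
print(budget.allow(60))  # fits within the 100-token budget
print(budget.allow(50))  # would push usage to 110, so it is rejected
```

A real version would track this per key and rotate to the next key (or a local LLM) when `allow` returns `False`.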