We had limited time due to shipping delays and technical issues: our shipment was stuck at customs for over a month, and we faced power outages here in Kerala because of the floods. I barely had two weeks to figure everything out.
Because of all this, we deviated greatly from the original idea.
Here is my original idea submission: https://www.hackster.io/contests/amd2023/hardware_applications/16898
We then switched to another idea called LegalEase, focused mainly on "Adaptive Surveys: Automating legal issue triaging".
After that, we pivoted to "AI in CMS" with a bunch of features:
- Search + Chat
- Content Update Notifier
- Blog Page Chat
- Content Readability Optimization
- Quick Document to Structured Content
- Media Alt Text Generation
- Multilingual Content Translator
- Instant Question & Answer Generations
- Asset/Content Classification, Tagging and Categorization
- Adaptive Surveys
- Style Consistent Image Generation
- Complex Form Filling Assistant
We implemented a couple of them, but they were far from demoable for a larger audience.
Finally, I switched to a simple no-code chatbot built with Flowise, running the latest Llama 3.1 70B on Ollama. It works flawlessly thanks to the 48 GB of memory the AMD Radeon Pro W7900 makes available for inference.
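For anyone wanting to reproduce this setup, the rough steps are below. This is a sketch, not our exact configuration: the model tag and default port are assumptions based on Ollama's and Flowise's standard defaults.

```shell
# Pull the model (tag is an assumption; check `ollama list` / the Ollama library for exact names)
ollama pull llama3.1:70b

# Quick sanity check from the CLI before wiring anything up
ollama run llama3.1:70b "Say hello"

# Start Flowise (requires Node.js); the UI is served at http://localhost:3000 by default
npx flowise start
```

From the Flowise UI you then add a ChatOllama node pointing at the local Ollama endpoint and connect it to a conversation chain.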
Here is a short demo:
We tried to plug it into Milvus for RAG-based chat, but configuring things beyond Flowise removed the no-code aspect, so we switched to a much slower local vector store; even then, we kept running into minor technical issues along the way.
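To illustrate what the local vector store fallback is doing under the hood, here is a toy in-memory store for the retrieval step of RAG. The documents and hand-made 3-d "embeddings" are purely illustrative; a real setup (Milvus, or Flowise's built-in store) would use a proper embedding model and an indexed search instead of a linear scan.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class LocalVectorStore:
    """Naive in-memory vector store: linear scan, no indexing."""

    def __init__(self):
        self.items = []  # list of (embedding, text) pairs

    def add(self, embedding, text):
        self.items.append((embedding, text))

    def search(self, query_embedding, k=1):
        # Rank every stored document by similarity to the query
        ranked = sorted(self.items,
                        key=lambda it: cosine(query_embedding, it[0]),
                        reverse=True)
        return [text for _, text in ranked[:k]]

store = LocalVectorStore()
store.add([1.0, 0.0, 0.0], "Doc about GPU inference")
store.add([0.0, 1.0, 0.0], "Doc about customs delays")
print(store.search([0.9, 0.1, 0.0], k=1))  # → ['Doc about GPU inference']
```

The linear scan is why this kind of store gets slow as the corpus grows, which is exactly the trade-off we hit compared to Milvus.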