Kehinde Fatosa

Insight7
Content and Community Manager
Call Evaluation by Insight7
Weaviate gives us fast semantic search across unstructured call data, which is crucial for surfacing insights in real time. Its native vector support and scalability made it the best fit for building an AI-native platform like Insight7.

What's great

scalability (1), semantic search (1), vector embeddings (2)
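As a rough sketch of what semantic search over call data can look like with Weaviate's Python client (v4), the snippet below runs a near_text query against a hypothetical CallSegment collection. The collection name, property names, and query text are assumptions for illustration, and near_text presumes a text vectorizer is configured on the collection; none of this reflects Insight7's actual schema.

```python
import weaviate

# Connect to a locally running Weaviate instance (default ports assumed).
client = weaviate.connect_to_local()

try:
    # "CallSegment" is a hypothetical collection of transcript chunks;
    # near_text needs a text vectorizer module configured on it.
    calls = client.collections.get("CallSegment")

    # Vector (semantic) search: returns segments closest in meaning to the
    # query, not just keyword matches.
    response = calls.query.near_text(
        query="customer frustrated about pricing",
        limit=5,
    )

    for obj in response.objects:
        print(obj.properties.get("call_id"), obj.properties.get("text"))
finally:
    client.close()
```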
Call Evaluation by Insight7
We chose MongoDB for data storage because of its flexibility in handling unstructured and semi-structured data at scale. MongoDB's document-based model lets us store complex metadata and insights without rigid schema constraints, making it ideal for fast iteration and large-scale evaluation.

What's great

schema flexibility (5), high scalability (7), flexible document model (5)
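A minimal PyMongo sketch of the schema-flexible storage described above; the database, collection, and field names here are illustrative placeholders, not Insight7's actual data model.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
calls = client["insight_demo"]["call_insights"]  # hypothetical db/collection

# Documents can carry arbitrary nested metadata and insight lists without a
# predefined schema, so new fields can be added as the evaluation evolves.
calls.insert_one({
    "call_id": "c-1042",
    "duration_sec": 1830,
    "insights": [
        {"type": "product_gap", "summary": "Customer asked for SSO support"},
        {"type": "sentiment", "score": 0.72},
    ],
    "metadata": {"team": "sales", "region": "EMEA"},
})

# Query directly on a nested field; no migration is needed when later
# documents add fields that earlier ones lack.
for doc in calls.find({"insights.type": "product_gap"}):
    print(doc["call_id"])
```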
Call Evaluation by Insight7
We selected OpenAI to power our LLM APIs because of its performance, stability, and continual improvements in conversational and contextual understanding. OpenAI models provide the nuanced comprehension we need to evaluate calls accurately: identifying patterns, surfacing product gaps, and scoring qualitative interactions reliably. Its developer-friendly APIs also allowed us to move quickly from prototype to production without compromising on capability.

What's great

fast performance (27), context aware (6), AI integration ease (21), AI developer experience (12)
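Below is a small, hypothetical example of scoring a call transcript with OpenAI's Python SDK (v1+). The model name, system prompt, and rubric are placeholders chosen for illustration rather than details of Insight7's production pipeline.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript = (
    "Agent: Thanks for joining. What's blocking your rollout?\n"
    "Customer: Honestly, the lack of SSO. Pricing is also unclear for our team size."
)

# Ask the model for a structured evaluation of the call: a qualitative score
# plus any product gaps it can surface from the conversation.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You evaluate customer calls. Return a 1-5 quality score, "
                "a one-sentence justification, and any product gaps mentioned."
            ),
        },
        {"role": "user", "content": transcript},
    ],
)

print(response.choices[0].message.content)
```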