
1:1 Visual Conversation Journal

1:1 is an interactive platform that transforms recorded dialogue into dynamic, ASCII-inspired generative art. By analyzing the "unseen" layers of a conversation—tone, rhythm, and emotion—the system creates a living visual journal that makes human connection explorable and tangible.
Overview
Problem
Digital communication reduces rich, human interaction to flat, cold text. Standard transcripts fail to capture the nuance of speech: the pauses, hesitations, and emotional shifts that carry true meaning. The challenge was to design a system that bridges the gap between raw data and human sentiment.
Research & Discovery
The research phase focused on how humans communicate beyond words. I studied Jefferson Transcription, a linguistic method for mapping the "music" of speech, to determine which conversational cues were most vital to visualize.
Key Insights
Information Overload
Visualizing an entire conversation at once is overwhelming; users need specific "entry points" to filter the data.
Trust & Transparency
A visualization is only effective if the user can see the "why" behind it. Linking the transcript directly to the generative form is essential.
Visual Language
Clinical charts feel too detached for emotions. ASCII was chosen for its "digital-organic" texture, which feels both analytical and expressive.
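Jefferson Transcription encodes the "music" of speech in plain text: timed pauses appear in parentheses (e.g. "(0.7)" for seven tenths of a second) and loudness is often signaled with capitals. A minimal sketch of pulling such cues out of a transcript line; the function name and the two-cue scope are illustrative, not the project's actual parser.

```python
import re

def extract_cues(line: str) -> dict:
    """Pull a few Jefferson-style cues out of one transcript line.

    Only two of Jefferson's conventions are sketched here: timed pauses
    written as "(0.7)" and loudness signaled by all-caps words.
    """
    pauses = [float(p) for p in re.findall(r"\((\d+\.\d+)\)", line)]
    loud = re.findall(r"\b[A-Z]{2,}\b", line)
    return {"pause_seconds": pauses, "loud_words": loud}

cues = extract_cues("I was (0.7) REALLY not expecting that (1.2) at all")
```

Cues like these are the raw material the visualization maps to form: a long pause can become empty space, a loud word a denser cluster of glyphs.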
System Architecture: The Logic
I designed a custom full-stack pipeline that translates raw audio into a meaningful visual experience. This required orchestrating AI as a logic engine rather than just a content generator.
Ingestion
A Python back-end segments raw MP3 files into speaker-specific utterances.
AI Analysis
GPT-4o classifies each segment into one of 92 emotional categories and extracts acoustic markers such as speed and volume.
Data Bridge
Using Cursor as an AI-native development environment, I architected a JSON framework that feeds this data to the front-end for real-time rendering.
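A minimal sketch of what one record crossing the Data Bridge might look like. The case study does not show the actual schema, so the `Utterance` type and its field names are hypothetical illustrations of the idea: each analyzed segment carries its speaker, timing, emotion label, and acoustic markers into a JSON payload the front-end can render.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shape of one analyzed utterance; the real schema is not
# shown in the case study, so these field names are illustrative.
@dataclass
class Utterance:
    speaker: str   # who spoke this segment
    start: float   # seconds from the start of the MP3
    end: float
    text: str      # transcript of the segment
    emotion: str   # one of the 92 emotion labels from the classifier
    speed: float   # acoustic marker: words per second
    volume: float  # acoustic marker: normalized loudness, 0..1

def to_json_payload(utterances: list[Utterance]) -> str:
    """Serialize analyzed utterances into the JSON the front-end renders."""
    return json.dumps({"utterances": [asdict(u) for u in utterances]}, indent=2)

sample = [
    Utterance("A", 0.0, 2.4, "How was your week?", "curiosity", 2.1, 0.6),
    Utterance("B", 2.6, 6.0, "Honestly... a lot happened.", "hesitation", 1.4, 0.4),
]
payload = to_json_payload(sample)
```

Keeping the bridge as plain JSON means the back-end analysis and the front-end renderer can evolve independently, as long as both agree on the record shape.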
Navigating the Data: Three View Modes
To make the complex data digestible, I designed three distinct view modes that let users pivot their focus based on their goals.
Timeline View
Displays the complete history from the first conversation to the last, letting users filter the chronological flow by specific categories.
People View
Groups conversations by individual speaker, visualizing interaction volume to show clearly who the user communicates with most.
Emotion View
Surfaces the most frequent emotional patterns across the user's history, so users can categorize and explore their past by dominant feelings rather than chronological dates.
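The three view modes reduce to three queries over the same records: a filtered sort, a group-by-speaker count, and an emotion frequency count. A sketch under assumed record fields (`speaker`, `date`, `emotion` are my illustrative names, not the project's):

```python
from collections import Counter

# Hypothetical conversation records, as the pipeline's JSON might provide them.
records = [
    {"speaker": "Mina", "date": "2024-03-01", "emotion": "joy"},
    {"speaker": "Mina", "date": "2024-03-08", "emotion": "hesitation"},
    {"speaker": "Theo", "date": "2024-04-02", "emotion": "joy"},
]

def timeline_view(records, emotion=None):
    """Chronological flow, optionally filtered by an emotion category."""
    hits = [r for r in records if emotion is None or r["emotion"] == emotion]
    return sorted(hits, key=lambda r: r["date"])

def people_view(records):
    """Interaction volume per speaker (who the user talks to most)."""
    return Counter(r["speaker"] for r in records).most_common()

def emotion_view(records):
    """Most frequent emotional patterns across the whole history."""
    return Counter(r["emotion"] for r in records).most_common()
```

All three views read the same data; only the pivot changes, which is what lets users switch modes without re-processing anything.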
The Conversation Page
The center of the experience is the generative ASCII blob: the emotional core of the conversation, designed to move beyond static data and create a living representation of dialogue.
Real-time Evolution
The blob shifts as the dialogue unfolds, so the form reads as a living record of the exchange rather than a static chart.
Aesthetic Logic
The ASCII texture gives the form a "digital-organic" quality that feels both analytical and expressive.
Visceral Feedback
The visualization provides immediate, emotional feedback on the "vibe" of the conversation without requiring the user to read text.
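One way such a blob could be rendered: map an intensity value onto a ramp of ASCII glyphs, denser characters toward the emotional center. The radial-falloff formula and the character ramp are my assumptions for illustration, not the project's actual renderer.

```python
import math

# Character ramp from sparse to dense; denser glyphs read as higher intensity.
RAMP = " .:-=+*#%@"

def render_blob(intensity: float, size: int = 11) -> str:
    """Render a round ASCII form whose density tracks intensity (0..1).

    Illustrative sketch: a simple radial falloff from the center,
    scaled by the emotional intensity of the conversation.
    """
    rows = []
    for y in range(size):
        row = []
        for x in range(size):
            # Distance from the center, normalized to 1.0 at the edges.
            dx = (x - size // 2) / (size / 2)
            dy = (y - size // 2) / (size / 2)
            d = math.hypot(dx, dy)
            level = max(0.0, 1.0 - d) * intensity  # fade toward the edge
            row.append(RAMP[min(int(level * (len(RAMP) - 1)), len(RAMP) - 1)])
        rows.append("".join(row))
    return "\n".join(rows)

calm = render_blob(0.3)
excited = render_blob(1.0)
```

Driving `intensity` (and, in a fuller version, shape and motion) from the per-utterance emotion data is what would make the form evolve with the conversation.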
Outcomes & Learnings
Systemic Design
This project shifted the focus from static UI to designing the underlying rules and logic of a generative system.
Technical Agility
By using AI-assisted development tools like Cursor, I was able to deliver a functional, full-stack product, bridging the gap between design and engineering.
Brand Identity
I turned a technical constraint (ASCII) into a unique brand identity that balances high-tech analysis with human-centric expression.