How Research, Sound Design, and AI-Based Development Shaped This Accessibility Tool
TiniAid was born from a need for better accessibility for neurodivergent individuals, particularly those with ADHD. This page outlines the entire journey from initial inspiration to research, sound design, development, and user testing. The goal? To create an accessibility tool that makes focus easier for everyone, without friction or distractions.
The idea for TiniAid began during an accessibility course that completely changed my perspective. Before this, I hadn't fully considered how accessibility extends beyond visual and physical needs; neurodivergent individuals, particularly those with ADHD, require different forms of support. That's when I realized: to help others, I should first help myself. Since I personally have ADHD, I wanted to create something that could aid my own focus, knowing it could also benefit others.
This project wasn't just about building a tool; it was about rethinking how accessibility could integrate seamlessly into people's workflows. Instead of making an app that users have to download, which introduces friction and distraction, I wanted to create something that's simply there when you need it, embedded in the environment where users are already working.
My research paper titled "Sound as Support: Techno Music Features for ADHD-Friendly Digital Accessibility" formed the foundation of this project. Studies suggest that repetitive beats, long loops, amplitude modulation, and ambient soundscapes create a structured auditory environment that enhances concentration for neurodivergent individuals.
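To make one of these features concrete, amplitude modulation layers a slow, periodic gain swing over a steady carrier tone, producing the gentle "breathing" quality described in the research. The sketch below generates such a signal as a raw sample array; the function name and all parameter values are illustrative choices of mine, not settings from TiniAid:

```javascript
// Generate an amplitude-modulated sine tone as a raw sample array.
// All parameter values here are illustrative, not TiniAid's actual settings.
function amplitudeModulatedTone({
  carrierHz = 220,   // audible carrier tone
  modHz = 0.5,       // slow modulation rate; sub-1 Hz reads as "breathing"
  depth = 0.4,       // modulation depth: 0 = no modulation, 1 = full
  seconds = 2,
  sampleRate = 44100,
} = {}) {
  const n = Math.floor(seconds * sampleRate);
  const samples = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    const t = i / sampleRate;
    const carrier = Math.sin(2 * Math.PI * carrierHz * t);
    // Gain swings smoothly between (1 - depth) and 1
    const gain = 1 - depth * 0.5 * (1 + Math.sin(2 * Math.PI * modHz * t));
    samples[i] = carrier * gain;
  }
  return samples;
}
```

Because the modulation is slow and shallow, the loudness never drops to silence, which keeps the texture structured rather than attention-grabbing.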
To implement this, I needed to extract, modify, and test multiple sound layers to ensure they actually improved focus rather than creating distraction. Using a DJ console and DJ software, I isolated key frequencies, modified BPMs, and created long-loop variations. However, this came with its own challenges: some loops were too short, some frequencies were too sharp, and some sound choices unintentionally induced anxiety instead of focus. Through testing, I addressed these issues to create a more effective sound experience.
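The BPM adjustments above come down to simple arithmetic: shifting a loop's tempo via playback rate is just the ratio of target to source BPM. The helpers below are a hypothetical sketch of that math (the function names and BPM values are mine, not from TiniAid's codebase):

```javascript
// Playback rate needed to shift a loop from its recorded BPM to a target BPM.
// Hypothetical helper, not taken from the TiniAid codebase.
function playbackRateForBPM(sourceBPM, targetBPM) {
  if (sourceBPM <= 0 || targetBPM <= 0) {
    throw new RangeError("BPM values must be positive");
  }
  return targetBPM / sourceBPM;
}

// Seconds one bar (default 4 beats) lasts at a given BPM, useful when
// checking that a loop is long enough to avoid an obvious seam.
function barDurationSeconds(bpm, beatsPerBar = 4) {
  return (beatsPerBar * 60) / bpm;
}
```

For example, slowing a 128 BPM loop to 100 BPM means a playback rate of 0.78125. Note that a plain playback-rate change also shifts pitch; a tempo-only change requires time-stretching, which is what DJ software handles.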
With the sound experience refined, the next step was development. Since I wanted this tool to be as frictionless as possible, I explored AI-based development using Vercel. The goal was to ensure seamless integration into websites, removing the need for users to switch applications or manually configure settings.
Learning development wasn't just about writing code; it was about understanding how things actually work. I explored different embed methods (iframe vs. direct injection), optimized UI consistency, and ensured the bot could load dynamically without breaking website layouts. The process also involved countless bug fixes, ensuring that UI components like fonts, sliders, toggles, and animations matched my vision.
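To make the iframe-vs-injection trade-off concrete, here is a minimal sketch of a loader that builds and injects an iframe embed. The URL, sizing, and positioning values are placeholders of mine, not TiniAid's real embed code:

```javascript
// Build the markup for an iframe embed. An iframe isolates the widget's
// CSS and JS from the host page (no style collisions), at the cost of
// making it harder to match the host's look; direct injection is the
// reverse trade-off. Values below are placeholders, not TiniAid's.
function buildEmbedIframe(src, { width = "320", height = "480" } = {}) {
  return (
    `<iframe src="${src}" width="${width}" height="${height}" ` +
    `style="border:0;position:fixed;bottom:16px;right:16px;" ` +
    `title="Focus sound widget" loading="lazy"></iframe>`
  );
}

// Inject the embed without disturbing the host layout: fixed positioning
// keeps the widget out of the normal document flow. Guarded so the
// snippet is inert outside a browser environment.
function injectEmbed(src) {
  if (typeof document === "undefined") return;
  document.body.insertAdjacentHTML("beforeend", buildEmbedIframe(src));
}
```

With `loading="lazy"` and fixed positioning, the widget loads late and never shifts the page's existing content.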
// Audio controller implementation
class AudioController {
  constructor() {
    this.audioElements = new Map();
    this.volume = 0.5;
    this.isPlaying = false;
    this.currentBPM = "100";
    // Preload audio files
    this.preloadAudioFiles();
  }

  async play(components) {
    // Only proceed once playback has been enabled elsewhere
    if (!this.isPlaying) return;
    // Get the correct audio keys for the selected components
    const audioKeys = this.getAudioKeysForComponents(components);
    // Play all components
    for (const key of audioKeys) {
      await this.playAudioFile(key);
    }
  }
}

While TiniAid has come a long way, there's still much more to explore and improve. The journey doesn't end here; I'm committed to expanding this tool's capabilities and reach. Here are some of the key areas I'm focusing on for future development:
Develop AI-driven personalization that learns user preferences and adjusts sound profiles automatically based on usage patterns and feedback.
Create additional themed soundscapes for different environments and tasks, from deep work to creative brainstorming sessions.
Extend TiniAid beyond web browsers to desktop applications, mobile apps, and smart home devices for a seamless experience.
Want to contribute or stay updated? Reach out through the contact form on the home page.