HapViz - Visualizing haptic signal descriptions

Role

Designer and Developer

Industry

Academia, tech

Duration

4 months

Background

Over the past year, I collected a large dataset of user-written descriptions for 256 vibrotactile haptic signals — 30 free-text descriptions per signal — capturing how people perceived them in terms of sensory, emotional, and associative qualities.

I designed and built an interactive web interface (React + D3.js) to help haptic designers, researchers, and developers explore this dataset and select signals that best fit their needs.

What the tool does

  • Gallery view: Browse all signals with advanced filters by sensation, emotion, or association.

  • Keyword exploration: Interactive scatter plots group keywords by meaning, showing patterns in how signals are described.

  • Signal details: Click a signal to see its emotional scores, keywords, and category breakdown.

How it works

  • Used GPT-4 to extract key sensory, emotional, and associative keywords from 7,680 descriptions.

  • Grouped keywords into categories and calculated emotional scores using the NRC Lexicon (based on Plutchik’s emotion model).

  • Created semantic maps with SBERT word embeddings + multidimensional scaling, so related words cluster together visually.
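
The embedding + MDS step can be sketched as below. This is a minimal illustration, not the tool's actual code: the keyword list and 3-dimensional vectors are toy stand-ins for real SBERT sentence embeddings, and the projection uses classical multidimensional scaling implemented directly with NumPy.

```python
import numpy as np

# Toy stand-ins for SBERT embeddings; the real pipeline encodes each
# keyword with a sentence-transformer model into a high-dimensional vector.
keywords = ["buzzing", "humming", "calm", "gentle", "alarming", "urgent"]
emb = np.array([
    [0.9, 0.1, 0.0],   # buzzing
    [0.8, 0.2, 0.1],   # humming
    [0.1, 0.9, 0.1],   # calm
    [0.2, 0.8, 0.0],   # gentle
    [0.1, 0.1, 0.9],   # alarming
    [0.0, 0.2, 0.8],   # urgent
])

def classical_mds(X, dims=2):
    """Project points to `dims` dimensions via classical MDS."""
    # Pairwise squared Euclidean distances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = sq.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ sq @ J                 # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:dims]   # keep the largest eigenvalues
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))

coords = classical_mds(emb)  # 2D positions for the scatter plot
```

Because MDS preserves pairwise distances as well as a 2D layout allows, semantically similar keywords (here, "buzzing" and "humming") end up near each other in the plot.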


Views created

The dashboard view shows all signals at a glance. The filter panel on the side lets users filter by sensory, emotional, and associative properties.
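
The filtering logic behind the panel can be sketched as follows. The record fields and keyword sets here are hypothetical, chosen only to illustrate matching a signal against per-category criteria.

```python
# Hypothetical signal records; field names are illustrative, not the
# tool's actual schema.
signals = [
    {"id": 1, "sensory": {"buzzing"}, "emotional": {"urgent"}, "associative": {"alarm"}},
    {"id": 2, "sensory": {"humming"}, "emotional": {"calm"},   "associative": {"purring"}},
    {"id": 3, "sensory": {"tapping"}, "emotional": {"calm"},   "associative": {"rain"}},
]

def filter_signals(signals, **criteria):
    """Keep signals whose keyword sets contain every requested keyword.

    `criteria` maps a category name (e.g. "emotional") to a set of
    required keywords in that category.
    """
    def matches(sig):
        return all(kw in sig.get(cat, set())
                   for cat, kws in criteria.items() for kw in kws)
    return [s for s in signals if matches(s)]

calm_signals = filter_signals(signals, emotional={"calm"})
```

Combining categories narrows the gallery further, e.g. `filter_signals(signals, emotional={"calm"}, sensory={"tapping"})` keeps only signals matching both.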

The keyword plot lets users analyze and select the keywords used to describe signals; the view can be switched by category.

The signal panel shows each signal's properties: users can select categories, view individual keywords, and see the emotions associated with the signal.
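
The per-signal emotion breakdown can be sketched as a lexicon lookup over the signal's extracted keywords. The tiny lexicon below is a hypothetical stand-in for the NRC Emotion Lexicon, which maps English words to Plutchik's eight basic emotions.

```python
from collections import Counter

# Illustrative mini-lexicon; the real tool uses the NRC Emotion Lexicon.
LEXICON = {
    "alarming": {"fear", "surprise"},
    "urgent":   {"fear", "anticipation"},
    "gentle":   {"joy", "trust"},
    "calm":     {"joy", "trust"},
}

def emotion_scores(keywords):
    """Count emotion associations over a signal's keywords and
    normalize to fractions, so the panel can show a breakdown."""
    counts = Counter(e for kw in keywords for e in LEXICON.get(kw, ()))
    total = sum(counts.values()) or 1  # avoid division by zero
    return {emotion: n / total for emotion, n in counts.items()}

scores = emotion_scores(["gentle", "calm", "urgent"])
```

For the keywords above, "joy" and "trust" each account for a third of the associations, with "fear" and "anticipation" making up the rest, which is the kind of distribution the panel visualizes per signal.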

Why it’s valuable

Haptic feedback is widely used to improve UX, accessibility, and navigation cues in multimodal applications. By mapping how people describe haptic signals, this tool helps:

  • Designers match signals to desired sensations or emotional effects.

  • Researchers explore language–tactile relationships and train predictive models.

  • Teams discover patterns in perception that might otherwise stay hidden.

The result is an intuitive, visual, and data-driven way to navigate the subtle language of touch.
