
Inclusive and Accessible Design

Work with outside resources, as well as independent study, to increase the inclusivity and accessibility of materials. This includes making informed decisions around text styles and color palettes, using audio and images to supplement text, and creating tools that support conventional keyboard navigation.

“Abydos,” The Emma B. Andrews Diary Project

Network Visualization Prototype

Sarah Ketchley requested that I work up a few models for the network visualization. I drew up a few rough sketches, which can be seen below, but I felt there was something important about the way the side view, overhead view, and tiered view interacted with one another that I hoped to convey to the rest of the team in video format.

That said, I am not sure I completely landed the movement in this video, as it was my first attempt with Adobe After Effects' 3D function. Hopefully, this video can at least get a conversation started around this concept and help shape the final product. The basic elements of this video can be revised and iterated on for the proof-of-concept material later in the project. More insight into the inclusivity and accessibility work that informed this design's features can be gleaned from the week two reflection.

Rough Sketch for Prototype Video


Haptic Sound Design for Accessible Navigation

While the bold colors I chose for the Fields Network Visualization all clear the color-differentiation requirements of the W3C's Web Content Accessibility Guidelines, I wanted to see if I could create scaled audio feedback to supplement the visual navigation cues. I imagined the different nodes hitting different notes as they appeared chronologically in the dynamic visualization, creating another level of comprehension in the tool. Obviously, the video isn't an interactive medium, but I thought it might be an interesting experiment that could be implemented in future iterations of the tool.
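
To make the idea concrete, here is a minimal sketch of how field-specific blips could be laid onto the video's audio track in chronological order. Everything in it is an assumption for illustration, not the project's actual pipeline: the node data and dates are invented, the timeline length and file names are hypothetical, and pydub is just one plausible tool choice. It presumes one pre-rendered tone file per field (building those files is covered below).

```python
from datetime import date
from pydub import AudioSegment

# Hypothetical sample of nodes: (first appearance, field index 0-11).
nodes = [
    (date(1889, 11, 30), 3),
    (date(1890, 1, 12), 7),
    (date(1890, 2, 3), 3),
]

start, end = date(1889, 11, 1), date(1890, 3, 1)
duration_ms = 20_000  # assumed length of the animated timeline in the video

track = AudioSegment.silent(duration=duration_ms)
for appeared, field in nodes:
    # Place each blip proportionally to where its node lands on the timeline.
    t = (appeared - start).days / (end - start).days
    tone = AudioSegment.from_file(f"field_{field:02d}.wav")  # one tone per field
    track = track.overlay(tone, position=int(t * duration_ms))

track.export("timeline_audio.wav", format="wav")
```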

Haptic Sound Design

In order to create the node tone, I recorded my kalimba, my guitarrón and an instrument called a tremoloa, which I didn't wind up keeping in the mix. After I recorded a few different tones and notes, I merged the recordings. Initially the sounds were around three seconds long; this was shaved down to one second until I ultimately decided a half second was an ideal blip. After smoothing out the shape and increasing the bass, the tone felt appropriately haptic, or relating to a sense of touch.*
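
For anyone who would rather do this step in code than in a DAW, the same merge-trim-shape-boost sequence might look something like the sketch below. The file names, filter cutoff, and gain values are guesses for illustration, not the settings I actually used.

```python
from pydub import AudioSegment

# Hypothetical source files standing in for the actual recordings.
kalimba = AudioSegment.from_file("kalimba.wav")
guitarron = AudioSegment.from_file("guitarron.wav")

blend = kalimba.overlay(guitarron)       # merge the recordings
blip = blend[:500]                       # trim to the half-second blip
blip = blip.fade_in(10).fade_out(150)    # smooth the attack and tail
# "Increase the bass" by layering a boosted, low-passed copy under the blip.
blip = blip.overlay(blip.low_pass_filter(200) + 6)
blip.export("node_tone.wav", format="wav")
```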

First Iteration

Once I had the tone, I created 12 different versions of the sound, one for each field that would be represented in the visualization. The tones were pitched up in whole steps, i.e., two semitones at a time (thirds sounded ominous for some reason), using Ableton's warp function.
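
I did this in Ableton, but the same whole-step ladder could be generated programmatically. The sketch below uses pydub's resampling idiom (note that `_spawn` is a semi-private method, though commonly used this way); resampling shifts pitch and duration together, which is acceptable for a sub-second blip. The input and output file names are hypothetical.

```python
from pydub import AudioSegment

base = AudioSegment.from_file("node_tone.wav")  # the half-second blip

def pitch_up(seg, semitones):
    """Raise pitch by resampling, then restore the original frame rate.
    Pitch and speed shift together, which is fine for a short blip."""
    factor = 2 ** (semitones / 12)
    shifted = seg._spawn(seg.raw_data,
                         overrides={"frame_rate": int(seg.frame_rate * factor)})
    return shifted.set_frame_rate(seg.frame_rate)

# Twelve tones, each a whole step (two semitones) above the last.
for i in range(12):
    pitch_up(base, 2 * i).export(f"field_{i:02d}.wav", format="wav")
```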

Second Iteration

Initially, I imagined that the tones would be played on every node and, if they overlapped, in concert. This turned out to be a hilariously bad idea. Logistically, there is no way I could arrange the sound cues for 257 nodes and, even if I could, my first few tests were incomprehensible if not unlistenable. Overlapping tones would require users to understand chords, which is prohibitively complicated. Instead, I chose to determine which field had the highest representation in a date range and limit the cues to that one field.
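
That "dominant field" rule is simple to express in code. A minimal sketch, assuming each node carries an appearance date and a field index (the sample data below is made up):

```python
from collections import Counter
from datetime import date

def dominant_field(nodes, start, end):
    """Return the field with the most node appearances in [start, end];
    only that field's tone is cued for the range."""
    counts = Counter(field for appeared, field in nodes
                     if start <= appeared <= end)
    return counts.most_common(1)[0][0] if counts else None

nodes = [(date(1890, 1, 3), 7), (date(1890, 1, 12), 7), (date(1890, 1, 20), 2)]
print(dominant_field(nodes, date(1890, 1, 1), date(1890, 1, 31)))  # -> 7
```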

Third Iteration

While this prototype for using tone to reinforce the introduction of a node on the timeline may be difficult to track when there are this many nodes on a single screen, it may be a useful addition to our other visualization, which tracks EBA on a day-to-day basis, where generally between one and ten figures appear. Having attributes that trigger a scaled tone either when they appear or when they are clicked on could be a helpful design feature that increases the tool's overall accessibility.

 

*Could I have just used a generic sine wave instead of going through this incredibly detail-oriented analog recording/editing process? ¯\_(ツ)_/¯
