ESL students graduate at a much lower rate than native-speaking students and score lower on standardized tests.
Bridge the gap with a universal language — pictures.
Using image recognition technology, build a translation tool for English Language Learners (ELLs).
Only 63 percent of ELLs graduate from high school, compared with the overall national rate of 82 percent. Of those who do graduate, only 1.4 percent take college entrance exams like the SAT and ACT ("English Language Learners: How Your State Is Doing," NPR).
Technology can't fix a broken education system, but it may be able to offer relief to the students who need help most.
Google seemed like the obvious home for this app. It already has the infrastructure: a robust image recognition API and Google Images, the most comprehensive image search on the web. It also has a demonstrated commitment to education and to expanding access to learning.
concept video (2 min.)
Use the phone's camera to run any visual through Google's Vision API.
Save and store any query to study and review it later.
Hung up on a word you don't know? Run it through Google's proprietary image search to get a visual translation.
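To make the camera feature above concrete, here is a minimal sketch of how the app might build a label-detection request for Google's Cloud Vision API (`images:annotate` endpoint). The helper name `build_label_request` and the dummy image bytes are my own illustration, not part of the actual Depict design; a real client would also attach an API key and POST the payload.

```python
import base64
import json

# Public REST endpoint for the Cloud Vision API.
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_label_request(image_bytes: bytes, max_results: int = 5) -> dict:
    """Build the JSON body for a Vision API label-detection call.

    The Vision API expects the image content base64-encoded, plus a list
    of requested features; LABEL_DETECTION returns descriptive labels
    the app could show as a visual translation.
    """
    return {
        "requests": [
            {
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [{"type": "LABEL_DETECTION", "maxResults": max_results}],
            }
        ]
    }

# Dummy camera-frame bytes stand in for a real photo here.
payload = build_label_request(b"\x89PNG fake camera frame", max_results=3)
print(json.dumps(payload, indent=2))
```

In the app, this payload would be sent with an authenticated POST to `VISION_ENDPOINT`, and the returned labels saved alongside the photo so the student can review the query later.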
I started the design process with pen and paper. When I was confident in the look and feel, I moved to Sketch. Above you can see iterations of the design, of varying levels of fidelity. The last screen is a concept design for the reverse image search feature, which I ended up discarding in favor of the current design.
Since Google Depict would be part of Google's suite of apps, I used the Material Design system. I chose the Amber palette as the primary color scheme, reminiscent of a No. 2 pencil. Because the app focuses on language, I borrowed many design elements and interactions from Google Translate.
Contact me at firstname.lastname@example.org.