12 Months of Designing for AI: February 2024

February 2024

It's been a wild few weeks in AI land! Between Musk and Altman firing shots at each other, Gemini's "woke" troubles, and Sora dominating creator AI, the landscape has shifted drastically, and so have the possibilities.

Gemini’s woes in particular have been a stark commentary on where UX Designers fit into the AI landscape. Here’s a single sentence summary of what went down:

Gemini internalized Google's diversity and inclusion standards to an extreme that compromised historical accuracy by swapping out Caucasian and Anglo-Saxon figures with … well, anyone but them.

Get a deep dive on TechCrunch ↗ 

Terms

  • GPT: Generative Pre-trained Transformer
    What is it:
    A type of model, popularized by OpenAI, that generates information from a prompt.
    Why should you know it: As we begin incorporating more GPT capabilities into applications, assuring quality, validity and sensitivity comes from using unbiased data to train the technology.

  • RAG AI: Retrieval Augmented Generation AI
    What is it:
    The capability or technique of retrieving information from existing sources to inform generative AI results.
    Why should you know it: It has the potential to reduce prompt-based interactions, therefore reducing the expertise a user might need to build up to use generative AI successfully (a minimal sketch of the flow follows below).
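
Since RAG keeps coming up, here's a minimal sketch of the retrieve-then-generate flow in Python. Everything in it is illustrative: `generate` is a stand-in for whichever LLM call a product actually uses, and the keyword-overlap scoring stands in for a real vector search.

```python
# Minimal, illustrative RAG flow: retrieve relevant passages, fold them into
# the prompt, then generate. Not any particular library's API.

def generate(prompt: str) -> str:
    """Stand-in for an LLM call (e.g., a GPT or Gemini API request)."""
    return "[model response grounded in the provided context]"

def retrieve_then_generate(question: str, documents: list[str], top_k: int = 3) -> str:
    # 1. Retrieval: rank existing sources by relevance to the question.
    #    A naive keyword-overlap score stands in for a real vector search.
    def overlap(doc: str) -> int:
        return len(set(question.lower().split()) & set(doc.lower().split()))

    passages = sorted(documents, key=overlap, reverse=True)[:top_k]

    # 2. Augmentation: fold the retrieved passages into the prompt so the
    #    model grounds its answer in those sources, not just the user's words.
    prompt = (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + "\n---\n".join(passages) + "\n\n"
        f"Question: {question}\nAnswer:"
    )

    # 3. Generation: hand the augmented prompt to the generative model.
    return generate(prompt)
```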

Tools

  • Learn: There’s an AI for that ↗
    A delightful aggregator of all the AI tools coming out on the daily. It's been useful in learning the types of tasks AI tools are being built to handle.

  • Read: The Shape of AI ↗
    An emerging compendium of AI UI patterns, heuristics and usability practices, Emily Campbell's passion project is quickly gaining recognition for its utility.

Thoughts

  • Learning the difference between GPT and LLMs was rather anticlimactic. It was definitely my newness to AI concepts that built up the difference and carried my imagination far and wide with the possibilities, but it's safe to sum up the difference in a single sentence:
    GPT is a type of LLM.

  • RAGs have been a fun thing to learn about! The capability for GPTs to retrieve information from relevant sources implies less reliance on prompts from users and a faster path to insights. It's like training associates: once you're done, they know where to go, what to do and how to do it. RAG + GPT = less burden on the user to instruct. This has real implications for increasing the utility of all kinds of apps for users, but it could come at the cost of increasing distrust. The current prompt-input interaction affords users visibility into the action/consequence relationship. With RAG in the mix, that visibility is lost… unless it's intentionally designed to stay visible. Enter UXDs 💪 (a sketch of what that could look like follows below).
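
To make that last point concrete, here's a hedged sketch, in Python with made-up names, of one way a RAG response could carry its retrieved sources forward so the interface keeps showing users where an answer came from:

```python
# Illustrative sketch: return the retrieved sources alongside the generated
# answer so the UI can keep the action/consequence relationship visible.
# Names and structure are assumptions, not any product's API.

from dataclasses import dataclass

@dataclass
class GroundedAnswer:
    text: str            # what the model generated
    sources: list[str]   # the passages the answer was grounded in

def generate(prompt: str) -> str:
    """Stand-in for whichever LLM call the product uses (same as the earlier sketch)."""
    return "[grounded model response]"

def answer_with_sources(question: str, retrieved: list[str]) -> GroundedAnswer:
    # Number the passages so the model (and the UI) can point back at them.
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(retrieved))
    prompt = (
        "Answer using only the numbered context below, citing the passages you used.\n\n"
        f"{context}\n\nQuestion: {question}"
    )
    # Returning the sources with the answer lets the interface render
    # "why am I seeing this?" affordances instead of hiding the retrieval step.
    return GroundedAnswer(text=generate(prompt), sources=retrieved)
```

Nothing fancy here: the whole design choice is that the retrieval step returns something the interface can render, rather than disappearing silently into the prompt.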

Tinkerings

  • I completed the IxDF AI course by Ioana Teleanu and thoroughly enjoyed it. She has structured a course that's great for foundational information, frames technical concepts in user experience terms, and offers a great mindset for incorporating and designing for AI-powered experiences.
    Referral: The annual IxDF membership is well worth it; it includes courses on AI, the basics of UX and so much more! If you're considering becoming a member, use this link to earn a couple of free months for both of us :)

  • I also took time to try Google Gemini. It's leaps ahead of ChatGPT's simplistic interface, and it's been nice to see the visual design flair along with some experience improvements. The task of writing prompts is still… tedious. But maybe a pivot from engineering prompts to designing them is overdue?


About This Series: I began exploring AI applications in B2B SaaS products in January 2024. It isn't lost on me that there are few precedents for designing for AI, or even for researching and iterating on it. In this series, I share tools, terms, thoughts and tinkerings, along with some lived experience of how cross-functional collaboration evolves, as I grow into a UX Designer for AI.

 