Doubleshot · 6 min read

☕️ Letting Go of Geometry (Again)

How type design made me see all the “fixed” logos on social media in a new, dimmer light.


Welcome to the Doubleshot newsletter! As you may have noticed, I’ve moved toward a “whenever it’s ready” model of sending the newsletter. I only want to send you all emails when I have something cool and cogent to share, so I’m publishing them somewhat sporadically.

That said, I’m grateful that you’re here and—as always—appreciate your feedback and ideas :)

Now, let’s get started.


📚 Read This: “Geometric Circles Are Born Diamonds” by Troy Leinster

Troy Leinster of Leinster Type (and one of my teachers at the Cooper Union’s Type@Cooper program) recently published a blog post about the inherently funky nature of geometry on digital displays, called Beware: geometric circles are born diamonds.

As I’ve written before, type design is a practice that really helped free me from a reliance on objective mathematics in design. It made me see all the “fixed” logos people post on social media (especially attempts to cram Google’s “super G” logo into a perfect circle) in a new, dimmer light. And Troy’s post demonstrates exactly why.

Prompted by Pentagram’s brand work for the Guggenheim (shown below), Troy unpacks graphic design’s reliance on optical illusion: perfect circles don’t exist in reality, and especially not on digital displays. Through a series of examples, he shows how to create a convincing-looking circle on screen, and suggests along the way that the new Guggenheim logo may itself be a “fixed” version that could benefit from some unfixing.

Check out Troy’s post here, and sign up for his newsletter where I saw the post first!

🙋 Q&A: UX Research & Strategy Seminar Japan by Members Co.

A few weeks ago, I had the chance to give a talk at the UX Research & Strategy Seminar hosted by Members Co. in Japan (I was in Zürich, delivering the talk remotely). It was a great experience, and well over 200 folks joined as I went through an updated and expanded version of a session I started working on last year: The Real Promise of AI in Design: Beyond Productivity and Towards Intersubjectivity.

Afterward, we had some time for Q&A. The participants submitted some really great questions, and I wanted to share a few of them (along with my answers) with the Doubleshot audience.

Designers will need to create interface spaces that change with AI. What variables will designers need to handle in this process? 

We can look at existing, well-established practices like responsive design or variable type to see the possible ways an interface changes in an AI world, if the AI is automating changes that we define ahead of time. One way I think about it is to work from the biggest part of the interface to the smallest, exploring the extremes at each level: starting with the device you’re on, moving to the window in which the app is running, then the main segments of that window, and finally the content and components. Along the way, there could be changes to size, composition, visibility, and all the stylistic variables your system supports. Working out the start and end points of each of those continua can give you an idea of how the interface could adapt to a range of situations.
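As a rough illustration of what those designer-defined continua could look like in code (the names, layers, and values here are hypothetical, not any real framework’s API), here’s a small TypeScript sketch where each layer of the interface gets variables with explicit extremes, so automated changes stay within ranges the designer chose ahead of time:

```typescript
// Hypothetical sketch: each interface layer declares its variables as
// ranges (continua), and any automated (e.g. AI-driven) change is
// clamped to the range the designer defined ahead of time.

type Continuum = { min: number; max: number };

interface LayerVariables {
  layer: "device" | "window" | "segment" | "component";
  size: Continuum;   // e.g. width in px
  visible: boolean;  // can this layer be hidden entirely?
}

// Keep a proposed (possibly AI-generated) value inside the designed range.
function clamp(value: number, c: Continuum): number {
  return Math.min(c.max, Math.max(c.min, value));
}

const sidebar: LayerVariables = {
  layer: "segment",
  size: { min: 240, max: 400 },
  visible: true,
};

// An automated layout change proposes 180px; the system keeps it at 240.
const applied = clamp(180, sidebar.size);
```

The point of the sketch is just the shape of the idea: the designer works on the endpoints, and the automation operates between them.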

As interfaces evolve, how do you think it will impact physical devices like displays and buttons that users interact with?

This is a really interesting question. Most of the hardware projects I’ve worked on have been in public spaces or kitchen appliances, where physical elements like buttons need to correspond closely with elements on screen. I think a big part of moving into more dynamic interfaces is establishing clear relationships between parts of the interface, such as hardware buttons and displays, and ensuring those relationships are preserved even as other elements change.

So, for example, I once spoke with David Reinfurt, the designer who worked on the subway ticket machines for New York City, and he told me that the individual hardware elements like the card reader, keypad, and ticket box were color-coded to match the color of elements on the touch screen. This relationship would need to be defined explicitly in software so that it doesn’t change. In terms of displays, I think the direction many people are working toward is making displays more ambient in our environment. Right now, that’s through efforts like foldables, virtual reality, and augmented reality, but I see a lot of effort across the industry to figure out where and how displays can appear and be used by individuals.
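One way to make that kind of hardware/screen relationship explicit in software is a single source of truth that both sides read from. This sketch is purely illustrative (the element names and hex values are made up, not the actual ticket-machine colors):

```typescript
// Hypothetical sketch: one table maps each physical element to a color,
// and on-screen components look up the same table rather than defining
// their own values, so the pairing survives other theme changes.

const hardwareColors = {
  cardReader: "#f5a800", // yellow
  keypad: "#00843d",     // green
  ticketBox: "#c8102e",  // red
} as const;

type HardwareElement = keyof typeof hardwareColors;

// The on-screen prompt for a hardware step reuses that element's color.
function promptStyle(element: HardwareElement) {
  return { borderColor: hardwareColors[element] };
}

const keypadPrompt = promptStyle("keypad"); // borderColor: "#00843d"
```

Because the mapping lives in one place, a dynamic theme can restyle everything else while this one relationship stays fixed.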

With AI-integrated interfaces, users will be able to create and use interfaces that match their individuality, context, and intentions on the spot. Do you think this will transform users from mere “users” into a kind of “creator,” thereby fostering creativity?

Yes, thinking about users as “co-creators” of their experience is a big part of the principles we are working toward in Material Design. We do this by making subsystems like color relational rather than deciding every value. As the interface becomes more dynamic and changes according to the user’s own subjectivity, they should of course feel empowered and in control of that experience. Otherwise it would just feel like a new, complicated update. I think individual users often know best what kinds of adjustments they need to improve their experience.
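To show what “relational rather than deciding every value” can mean in practice, here’s a deliberately simplified sketch: color roles are derived from a single user-chosen seed instead of being hardcoded. (Real systems like Material’s dynamic color work in perceptual color spaces; plain HSL and these particular role names are just for illustration.)

```typescript
// Hypothetical sketch: derive color roles from one seed hue, so the
// user's choice propagates through the whole theme relationally.

function hsl(h: number, s: number, l: number): string {
  const hue = ((h % 360) + 360) % 360; // normalize to 0–359
  return `hsl(${hue}, ${s}%, ${l}%)`;
}

function rolesFromSeed(seedHue: number) {
  return {
    primary: hsl(seedHue, 60, 45),    // main accent
    onPrimary: hsl(seedHue, 60, 95),  // text/icons on the accent
    surface: hsl(seedHue, 15, 98),    // near-white tinted background
  };
}

// The user picks a purple-ish seed; every role follows from it.
const theme = rolesFromSeed(260);
```

The user decides one value; the system decides the relationships, which is what keeps them feeling in control rather than confronted with a wall of settings.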

I would like to ask about the future of UX design through the co-creation of generative AI and humans. How do you think the evolution of generative AI will change the concepts and methods of UX design, and what new design possibilities will emerge?

I think there are a lot of possibilities. Many people talk about the role of AI in design as a kind of spectrum, where one end represents AI in full control of the interface. This, to me, is the most dangerous direction, since it would produce untested, fully automated interfaces that could have negative effects, not just for the business but for users. On the other end of the spectrum is AI as part of a design tool: generating sketches, creating examples of common patterns, things like that. I think AI will definitely exist at this end of the spectrum for a while, and it’s already accelerating a move away from the tactile exploration that was the basis of design for a long time. By that I mean using physical materials like a sketchbook to explore wireframes: people can move to higher-fidelity explorations right away, and the temptation to do that is stronger with generative AI.

As I mentioned in my talk, I think the longer-term future of interface design with AI relies on a more advanced version of the technology that’s not just a tool but a kind of orchestrator within the software itself, helping the user make dynamic changes to the interface.

While pursuing user-centered design, what kind of balance do you think is necessary to achieve the business goals of a company? If you have specific examples or practical methods you have implemented, please share them. We would also appreciate your thoughts on effective approaches to maintaining this balance.

I think throughout my career, the argument that has resonated most with business stakeholders is that we have shared goals. I might not express my goals in terms of business or revenue, but my goals as a designer are aligned with theirs in the sense that a great experience leads to higher user trust and a higher chance that people will want to engage with your product in a way that helps the business. Specific design proposals that come from business leadership can then be evaluated and discussed in those terms. This is where research also plays an important role in bringing numbers to the table, which is often more convincing than any amount of discussion.

As designers, we are already in the position of needing to understand and synthesize the language of many other roles like software engineering and research in order to do our job, and I think we are also responsible for being able to understand and use the language of business while retaining the reason we got into design in the first place.


That’s it for now, see you in the next one! 
