Designing the Affordance Field
If you are responsible for AI in your organization, you are not merely rolling out tools. You are designing an affordance field: the landscape of what is easy, what is hard, and what feels allowed.
The interface is one of your most powerful instruments. A blank chat box at the center of the screen teaches people to treat the system as a toy or a search bar. An environment that opens with guided workflows tied to real tasks teaches them to treat it as part of the job. Signifiers and defaults shape expectations long before any verbal message does. So take the time to show people how AI can generate the quarterly forecast, the call-center script, or the creative brief.
Spaces and rituals matter just as much. If AI conversations live only in a monthly steering committee or a single experimentation channel, you have already made a cultural decision: this is specialist work, done elsewhere. Contrast that with teams that host regular AI office hours, cross-functional show-and-tell sessions, or daily standups where people are explicitly invited to share what they tried with the model yesterday. The physical and digital spaces where AI is discussed either invite broad participation or keep it gated.
Finally, feedback loops need to be tuned to behavior, not just usage. Login counts and token volumes tell you how often people touch the system, not whether culture is changing. More meaningful signals might include how many teams routinely run pre-mortems with simulation, how many decisions are annotated as AI-assisted, or how often prompts are shared across teams rather than staying within a single group. When you track these patterns, you start to see whether new habits are taking root.
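To make the distinction concrete, here is a minimal sketch of the difference between usage metrics and behavior metrics. Every team name, event kind, and record here is invented for illustration; the point is only that the two kinds of signal come from the same activity log but answer different questions.

```python
from collections import Counter

# Hypothetical activity log: each record notes a team and what it did.
# All names are illustrative, not drawn from any real system.
events = [
    {"team": "finance", "kind": "login"},
    {"team": "finance", "kind": "premortem_simulation"},
    {"team": "ops", "kind": "login"},
    {"team": "ops", "kind": "login"},
    {"team": "ops", "kind": "prompt_shared_cross_team"},
    {"team": "legal", "kind": "decision_annotated_ai_assisted"},
]

# Usage-style metric: how often people touch the system.
logins = sum(1 for e in events if e["kind"] == "login")

# Behavior-style metrics: signals that new habits are taking root.
BEHAVIOR_KINDS = {
    "premortem_simulation",
    "decision_annotated_ai_assisted",
    "prompt_shared_cross_team",
}
behavior_counts = Counter(
    e["kind"] for e in events if e["kind"] in BEHAVIOR_KINDS
)
teams_with_behaviors = sorted(
    {e["team"] for e in events if e["kind"] in BEHAVIOR_KINDS}
)

print(logins)                 # 3
print(teams_with_behaviors)   # ['finance', 'legal', 'ops']
```

The login count says the system is being touched; the behavior counts say which teams are running pre-mortems, annotating decisions, and sharing prompts, which is the culture-level signal the text argues for.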
Designing the affordance field with the ABC in mind turns the question from “How do we force adoption?” into “How do we make a new way of working feel natural and inevitable?”
Who Gets to Adapt?
In many organizations, AI affordances arrive first for those already closest to power: strategy teams, senior executives, advanced analytics groups. They get the sandboxes, the workshops, the time to play. Frontline workers, whose jobs might be most transformed by these tools, often receive only a thin slice: a chatbot bolted onto a legacy workflow, or a “pilot” they hear about in all-hands meetings but rarely touch.
If we take the ABC seriously, that asymmetry matters. On the frontline, attitudes quickly become: this is something done to us, not with us. Behaviors follow: quiet resistance, creative workarounds, minimal compliance. Over time, culture settles into a two-tier system, with AI-native elites on one side and everyone else working around systems they neither trust nor understand.
The same pattern plays out in our communities. Some neighborhoods get AI labs in the public library, evening classes in data literacy with childcare on site, and local mentors who know how to help people turn existing skills into new opportunities. Other neighborhoods encounter AI mainly as an impersonal force: automated call centers, opaque scoring systems, new forms of surveillance.
Moving from tools to culture means designing the affordance field so that many people can climb the ABC ladder, not just a chosen few. This can mean carrying AI coaching out to plants and branches instead of keeping it at headquarters, translating playbooks into the languages and contexts of different roles, and giving frontline teams both permission and budget to request new features and signifiers instead of merely accepting whatever the platform team releases.
From Tools to Culture: A Different Brief
If you are a leader, designer, or technologist, the real brief is wider than it looks. It is not enough to deploy tools, publish a set of policies, and run a round of training. Those are necessary, but they are not the finish line.
Instead, start by asking what attitudes your people actually hold about AI right now. Where are they curious? Where are they afraid? Where have they become numb or cynical? Then think about the everyday behaviors that would signal real extension of human judgment: more divergent thinking before decisions, more explicit reflection on how conclusions were reached, more willingness to let people at the edge of the organization shape how tools are used.
From there, imagine the kind of culture you would be proud to see emerge from these tools in five years. In that imagined culture, how do people talk about risk? How do they share learning? How do they argue with and alongside their models? Once you can see that future, you can work backward: what affordances, what signifiers, what small rituals today would make that future feel less like a slogan and more like “just how we do things”?
Affordances change people; people change affordances. If you work the ABC with intention, you will end up with more than better tools. Tuning attitudes, scaffolding behaviors, and letting culture emerge rather than hoping it appears will build a new kind of organization, one with adaptability at its core.