
The New Professional

This is my prediction about who thrives next in an AI-enabled world. It's not the fastest adopter. Not the most technically fluent. It's the professional with good taste.


I mean something specific. The ability to evaluate output, AI or human, against a standard that isn’t just speed or volume. To know when something is right, not just finished. To hold the organization’s judgment when the tools can’t.


Who Stays Responsible


Most organizations won’t transform overnight. What will change sooner is how they prepare, and at a pace that fits their culture and their clients. The new professional is the one who can operate in both worlds at once. Fluent in what’s coming. Grounded in what still requires a human being.


Robert Pirsig wrote about this in Zen and the Art of Motorcycle Maintenance. Not about AI, but about quality, and what happens when we stop attending to it. His argument was that the mechanic who genuinely cares about the machine produces different work than the one running a checklist. Not because caring is sentimental but because it changes what you notice. Taste is not instinct. It is built through exposure, through accumulated judgment, through the experience of getting things wrong before getting them right.


It is also built through the specific kind of feedback that only comes from someone further along. The correction that names not just what is wrong, but why it matters. That transfer requires a relationship. It requires someone willing to give it and an organization that makes room for it.


Pirsig’s novel was published during another moment of technological anxiety. The question he was asking is the same one sitting underneath this shift: when systems get faster and more capable, who stays responsible for whether the work is good?


That responsibility doesn’t delegate. It develops, or it doesn’t.


March 10, 2026 at 6:57:03 a.m.

Junk Food Work


There is a reason this kind of professional is becoming harder to find. And it has less to do with AI than most people assume.


A significant portion of early-career professional experience has always been junk food work. Repetitive, low-judgment tasks that produce volume without building much. Formatting documents that didn’t require thought. Drafting correspondence from templates. Preparing binders. Chasing signatures.


In a law firm, it looked like a junior spending three hours reformatting a precedent agreement someone senior would review in fifteen minutes, or a legal assistant manually tracking deadlines in a spreadsheet that a system could generate automatically. The work got done. The box got checked. But the professional metabolized very little from the experience.


Junk food work was never the point. The point was what surrounded it. The strategy conversation between a senior partner and a client. The debrief after a difficult negotiation. The correction delivered in real time by someone who had seen the same mistake made twenty years earlier. The judgment that transferred through proximity and opportunity.


That opportunity had a name. Mentorship. Not the formal kind with assigned pairs and quarterly check-ins, though that has its place. The kind that happened because people were in the same room, working on the same problem, close enough to the consequences that something real transferred. A junior professional didn’t just watch how a senior partner handled a difficult client. They felt the temperature of the room change. They heard what got said and what didn’t. They learned something that no process document could carry.


AI removes the junk food. That sounds like a gift.

It might also be a problem.


If the work that gets automated is the work that surrounded the judgment calls, the new professional may arrive at the threshold of real decision-making without the accumulated exposure that taste requires. They will have output. They won’t have the education that came from producing it badly first.


And there is a deeper problem. It’s not just that skill erodes when AI does the work. It’s that the incentive to build skill disappears. That incentive looks like struggle with a purpose. A task that is hard because it is supposed to be. A correction that names what was missed and why. A process that is slower because the slower version is also the one that teaches something.


Consider a junior accountant whose job once required doing the reconciliation manually. Tracing errors, understanding why they happened, learning to recognize what wrong looks like from the inside. Now AI flags the anomaly and suggests the fix. The junior accountant approves or dismisses. They’re getting faster. They’re not getting better at knowing what to look for, because they never had to look. Three years in, they’re efficient and incurious. The skill didn’t erode. It never formed.


Or consider a first-year associate at a law firm. They used to draft the memo badly, get it marked up, understand why the argument didn’t hold, and try again. The redline was the education. Now AI produces a competent first draft. The associate cleans it up, the partner approves it, the client gets a better work product faster. Everyone wins.


Until the associate is a third year who has never constructed a legal argument from scratch. They’ve edited hundreds. They haven’t built one. When the AI is wrong in a way that looks right, they don’t catch it. Because they were never trained on what wrong looks like from the inside.


It’s not laziness. It’s rational adaptation to the environment they were handed. The incentive to struggle through the hard version disappeared because the easy version was always available. That’s a design problem, not a character problem. It’s the organization’s responsibility to solve, and almost none of them are solving it.


If the mentorship that used to happen alongside that work hasn’t been deliberately redesigned, they won’t have that either.


The Skills That Hold


For most of the last century, soft skills were evaluated as cultural fit. Could this person read a room? Would they represent the firm well? Did they communicate in a way that felt right for the environment? The assessment was real, but the framing was limiting. Soft skills were treated as organizational accessories — important for belonging, difficult to measure, rarely counted as professional capital in the way that technical credentials were.


That framing is now obsolete.


Judgment, communication, ethics, mentorship, context. These are not cultural ornaments. They are transferable, compounding professional assets. The professional who can construct a clear argument, navigate ambiguity, build trust across distance, and name what a piece of work is missing carries value that moves with them.


It does not belong to a specific office, a specific team, or a specific set of tools. And as systems grow more capable of producing output, the ability to evaluate that output against a standard that isn’t just speed or volume is becoming the skill that holds everything else together.


Organizations that have not made this shift, that still treat soft skills as fit criteria rather than capability criteria, are already losing the professionals they most need to retain. They are also struggling to attract them.


There is something else worth naming. The capabilities that define the new professional — noticing what others miss, sitting with problems that don’t resolve neatly, evaluating work against a standard that isn’t just “this is how we’ve always done it” — aren’t new. They’ve been present in professionals that organizations have historically overlooked or underestimated.


Many neurodivergent and atypical professionals think, evaluate, and question in exactly these ways. Not as a special gift, but as a different cognitive approach that never fit the environments most firms were built around. Some were passed over at hiring. Others learned to mask so completely that their actual strengths never had room to surface. The organizations now wondering where to find professionals who can think critically alongside AI might start by asking who they’ve already been designing out.


Presence without intentional design is not mentorship. It is proximity.


The new professional does not need everyone in the same room. They need access. Structured, consistent, and delivered through whatever form serves the relationship and the learning. That access looks different depending on the organization and the individual. It might be a scheduled one-on-one that happens without exception. An impromptu conversation after a difficult client call. A practice group session where the chair goes beyond the agenda. A retreat that creates the conditions for relationship across geography. A deliberately designed approach to electronic communication — because the professional who can build trust and transfer judgment over email or video is developing a skill that compounds across every working environment they will ever inhabit.


The best mentorship programs that organizations are building do not choose between in-person and remote. They design for both. They create access through multiple forms and hold someone accountable for whether that access is reaching the people who need it.


The new professional already knows the difference. The question is whether their organization does.


In Practice


What follows is not a checklist. It is a set of questions and observations designed to help you see what is already present in your organization, and what may be quietly disappearing.


What does junk food work look like in your organization?


Before you can design for its absence, you need to name it. In most professional services firms, junk food work lives in the gap between what a task requires and what executing it teaches. It is not always obvious. Some junk food work looks important. It produces output. It fills time. It appears on a to-do list and gets crossed off.


The diagnostic question is not whether the work is necessary. It is whether doing it builds anything in the person doing it.


In a law firm, junk food work might look like this: a junior associate manually compiling a due diligence checklist that a template could generate, a paralegal reformatting a closing book that software could assemble, or a legal assistant chasing execution copies on a matter where a workflow system would send automated reminders. Necessary tasks. Low development value. Now candidates for automation.


Start with two lists


First: three tasks currently performed by junior professionals that could be automated without meaningful loss to their development. Work that produces output but teaches little. These are your early integration candidates.


Second: three tasks that, if automated tomorrow, would leave a gap in their development you haven’t planned for. Work where the doing was also the learning. These are your design problems.


The distance between those two lists is where your AI integration strategy begins. Most firms are only making the first list. The second one is where the real work is.


Where does judgment transfer in your organization?


In the best firms, judgment transfers through relationship and proximity. A senior partner who debriefs after a difficult client call. A practice group chair who explains not just what the decision was, but why. A mentor who says, “that’s technically correct, and here’s what it misses.”


These moments are not formal. They are not on a calendar. They are also not accidental in firms where culture is working. Someone designed the conditions for them, even if they didn’t call it design.


But here is what AI is exposing: not every practice group chair is a good mentor or educator. Seniority and technical excellence do not automatically transfer into the ability to develop people. In firms where judgment transfer happened because people were physically present and a senior colleague might step in, that gap was often covered by proximity. Remote and hybrid work removed the cover. AI is removing it further.


The question is no longer just where judgment transfers. It is who is accountable for making sure it does, and whether your firm has any mechanism for that accountability beyond assumption.


Ask yourself: in the last month, where did judgment transfer in your firm? A junior associate who sat in on a client strategy call and heard how a partner framed a difficult recommendation. A new hire who received a direct and specific correction on a first draft rather than a tracked-change revision with no explanation. A practice group meeting where the chair went beyond the agenda to discuss why a matter had gone sideways.


Who was in the room for those moments? Who wasn’t? What would have to change for more of your professionals to have access to that transfer?


What is your firm’s readiness sequence?


Before tool literacy, there is organizational readiness. Before organizational readiness, there is an honest assessment of how knowledge currently moves through your firm and where it doesn’t. Most firms skip that assessment entirely. They move directly to deployment and discover the gaps only after they’ve been accelerated.


Start here instead.


Identify. The new professional is already in your firm. They may not be the most senior person in the room. They may not have the longest tenure or the most visible file. They are the one who asks whether the output is right, not just whether it’s finished. The one who can articulate what a piece of work is missing and why. The one who builds trust across distance and communicates with enough clarity that their judgment travels without them in the room.

Can your firm identify that person? Not in the abstract. Specifically. By name, by practice group, by what you’ve observed them do. If you can’t, that is the first design problem. You cannot nurture what you haven’t named.


Nurture. Once you’ve identified the new professional, the question is what conditions your firm is currently providing for their development — and what is missing. This is not a training budget question. It is an access question. Are they in the rooms where judgment is exercised? Are they receiving feedback that names what was missed and why, or corrections that simply fix the output? Do they have relationships with senior professionals who are willing to transfer not just knowledge but the reasoning behind it?


A firm that skips to tool deployment without building these conditions first is not integrating AI. It is accelerating existing gaps.


The mentorship and communication audit that precedes any AI integration strategy is not a survey. It is an observation. Where does judgment transfer reliably in your organization? Where is it assumed to transfer but doesn’t? Where has it stopped transferring entirely and no one has named it yet? That audit tells you what your firm runs on before you introduce a system that will change how it runs.


Listen. This is the step most organizations skip entirely. The new professional often already knows what isn’t working. They can see the junk food work. They feel the absence of intentional mentorship. They know which senior colleagues transfer judgment and which ones simply transfer tasks. They have a view of your firm’s development conditions that no partner meeting agenda will surface on its own.


The question is whether your organization has created any mechanism for that signal to reach the people with the authority to act on it. Not an annual engagement survey. Not an open door that no one walks through. A genuine, structured, low-stakes channel for the new professional to name what they need and have someone listen with the intention of doing something about it.


This applies with particular weight to professionals who have historically had to mask or adapt to be heard at all. The new professional who is neurodivergent or atypical may have the clearest view of where your systems are failing — and the least confidence that naming it will be received rather than managed. Designing for their voice is not a separate equity initiative. It is part of the same design problem.


The firms that will develop the new professional are not necessarily the ones with the most people in the office or the most sophisticated AI tools. They are the ones that have asked the right questions, built structures that can deliver development consistently, and created accountability for whether it happens.


What are you investing in?


The new professional requires investment that doesn’t show up immediately on a balance sheet. Time with senior professionals. Exposure to decision-making conversations. Feedback that is specific and honest. Mentorship that is structured enough to happen consistently and human enough to teach something.


None of this is new. What is new is that the conditions that used to produce it by default — shared physical space, common workflow, proximity to consequence — are no longer guaranteed. They must be designed. And the design has to account for a world that is hybrid, increasingly digital, and moving faster than most mentorship programs were built to handle.


Designing for the new professional is not work that gets done between client calls. It requires the kind of focused attention that existing workloads rarely protect. Someone must hold the question, map what is present and what is missing, and build the structures that make development possible. In most firms, that person does not currently exist in the organizational chart. That gap is part of the design problem.


That question, what mentorship really requires in an AI-enabled, hybrid professional environment, is one worth examining carefully. It is also the subject of the next piece in this series.


The questions above are starting points, not an audit. If one of them surfaces something worth examining in your firm, that is the conversation worth having. Koehler Consulting works with professional services organizations on exactly this kind of organizational design. If you'd like to think through what this means for your firm specifically, I'd be glad to talk.
