
Timestamps:

00:00 - Introduction
01:10 - Overview of Leveraging AI in Associations 
06:02 - How Associations Can Use AI to Unlock Their Data
12:22 - Counter-Positioning Your Business for Success
15:19 - Multi-Tiered AI Offerings for Associations
18:28 - The Plummeting Cost of AI Tokens
24:41 - Impacts of Token Costs on AI Accessibility
28:46 - Generational Differences in AI Adoption
34:51 - Why Leaders Need to Adopt AI Now

 

Summary:

In this episode of Sidecar Sync, Amith and Mallory dive into three major topics shaping the future of AI in associations: leveraging an asset base with AI, the rapid decline in AI token costs, and the generational differences in AI adoption. Amith breaks down how associations can unlock the potential of their data to create innovative, revenue-generating services. They also explore how the plummeting cost of AI tokens is making sophisticated AI tools accessible, and why younger workers are more likely to embrace AI in their jobs. Tune in for insights into how AI will transform the association world.

Let us know what you think about the podcast! Drop your questions or comments in the Sidecar community.

This episode is brought to you by digitalNow 2024, the most forward-thinking conference for top association leaders, bringing Silicon Valley and executive-level content to the association space.

Follow Sidecar on LinkedIn

🛠 AI Tools and Resources Mentioned in This Episode:

Claude ➡ https://www.anthropic.com
ChatGPT ➡ https://openai.com/chatgpt
Groq ➡ https://groq.com
Llama 3 ➡ https://ai.facebook.com/tools/

⚙️ Other Resources from Sidecar: 

 

More about Your Hosts:

Amith Nagarajan is the Chairman of Blue Cypress 🔗 https://BlueCypress.io, a family of purpose-driven companies and proud practitioners of Conscious Capitalism. The Blue Cypress companies focus on helping associations, non-profits, and other purpose-driven organizations achieve long-term success. Amith is also an active early-stage investor in B2B SaaS companies. He’s had the good fortune of nearly three decades of success as an entrepreneur and enjoys helping others in their journey. Follow Amith on LinkedIn.

Mallory Mejias is the Manager at Sidecar, and she's passionate about creating opportunities for association professionals to learn, grow, and better serve their members using artificial intelligence. She enjoys blending creativity and innovation to produce fresh, meaningful content for the association space. Follow Mallory on LinkedIn.

 

Read the Transcript

Amith Nagarajan: Welcome back, everybody, to another episode of the Sidecar Sync, where all things association and artificial intelligence come together in a fun, interesting podcast. My name is Amith Nagarajan,

Mallory Mejias: And my name is Mallory Mejias.

Amith Nagarajan: and we are your hosts. And before we get going into our topics for today, let's take a moment to hear from our sponsor.

Mallory Mejias: Amith, how are you doing today?

Amith Nagarajan: I can't complain, it is, uh, September, and so here in New Orleans, we are starting to have the end in sight in terms of the crazy weather, and, uh, so we'll see what happens, you know, it's like 45 days from now, it'll not be horrible outside, so

Mallory Mejias: Absolutely, this morning I felt a light, not a, dare I say, chill, a light change in the weather in Atlanta. So I was really excited for that. I really like the fall. I like doing all things fall. I know it's basic, but the pumpkin spice coffees and the candles, and I like Halloween. So I'm really excited to get into this season for sure.

Amith Nagarajan: Yeah, it's gonna be fun, and we were just talking about, uh, being in Utah in a few weeks, or in a couple weeks, actually. We have an annual event for, uh, for Blue Cypress, our family of companies, uh, we have our senior leaders from all of the different companies come together, about 40, 45 people each year, up in Park City, Utah, where we have a learning event and, uh, that's always a lot of fun, and the weather in Park City truly is fall weather.

There'll be, you know, changing leaves and, uh, cooler temperatures, you know, wake up in the morning with 35, 40 degree temps and all that. And, uh, I'm, I can't wait for that. That's going to be awesome.

Mallory Mejias: It's going to be a great time. Last week, as all of you know, I was on my own, but Amith, it's nice to have you back, uh, for today's episode, for sure.

Amith Nagarajan: Thank you.

Mallory Mejias: Today, we will be talking about one, leveraging an asset base using artificial intelligence. Then we'll be talking about the falling cost of AI tokens. And finally, we'll wrap up with a talk around generational differences in AI adoption.

So starting with leveraging an asset base with artificial intelligence. Reuters, founded in 1851, is one of the world's largest and most trusted news organizations. It's known for providing unbiased, real time news to media outlets globally. In 2008, Reuters merged with the Canadian company Thomson Corporation to form Thomson Reuters.

Today, Thomson Reuters has transformed far beyond its news agency roots. While still operating Reuters News Agency, 90 percent of Thomson Reuters' revenue now comes from data services, not news. The company provides specialized information and software to professionals in law, tax, accounting, and other fields.

Thomson Reuters combines its extensive proprietary content with AI and machine learning to enhance its services. And by focusing on data and technology, the company has effectively reinvented itself for the digital age. It uses its vast database of legal and financial information as a competitive advantage in the AI era.

This strategy demonstrates how companies with unique content or valuable data can adapt to technological changes, and the company stock has outperformed the market, reflecting the effectiveness of this approach. Now, this use case shows that organizations can use their existing strengths combined with new technologies to stay competitive in a rapidly changing business environment.

Amith, you shared this with me, and in this situation, Reuters is leveraging an asset base, or a cornered resource, and using it to drive durable differential returns. Why do you see this as incredibly relevant for associations?

Amith Nagarajan: Well, the call out to durable differential returns, which is a key phrase coming from Hamilton Helmer's work in the Seven Powers of Strategy, which is one of my favorite books. And we're talking about it a lot on the pod and in our content. In fact, I just finished recording, uh, over the last few days, all of the content for the AI Learning Hub.

We are doing a new course on AI strategy and it's based on Helmer's Seven Powers and how they apply to AI, uh, to the association market. And one of the power positions that an organization can be in is to have something called a cornered resource, and a cornered resource can be intellectual property. It could be a team of people, it could be a patent, it could be a number of different things.

And so, uh, I think that's exactly what you just described: Reuters does have a cornered resource. They have a unique proprietary set of content that no one else in the world has. It would be very difficult for someone else to create that equivalent content, if not impossible. Um, and what they're doing is they're finding ways to create more leverage on that same asset base.

So, I saw that article and I said, this is exactly what I want us to share with the association community, because the same asset type exists in many associations, where, you know, you talk to an association leader and you say, Hey, what are the key assets that you guys have? And they'll say, Oh, we have an awesome team, and you say, cool.

That's great. We have great members. Okay, great. What else? And they'll say, Oh, we have this, all this amazing data. We have all this data on our sector. We have amazing content. It's usually one of the top three or four things, if not even the first thing that an association will cite as the value they provide or the assets they have, depending on who you're talking to and the language they prefer to use.

Uh, but I think it's extremely common to have deep insight and deep content in the market the association is in. However, most of the time that content is locked up. Not so much physically locked up by intention, but rather the lack of accessibility to that content, um, is really what prevents people from gaining the full potential value here. Reuters has not only made it easy to access, but they're creating derivative products using that intellectual property that you just spoke of, and that's driven up so much of their enterprise value in recent years.

Mallory Mejias: Can you take that a step further and maybe provide a few examples of what kinds of products or services an association could create with their cornered resource?

Amith Nagarajan: Well, think about it this way. So imagine I have a benchmarking report. Many associations do that. My benchmarking report, let's say, is about industry stats. Let's say that I'm in the air conditioning space. And so my members are all the people who install and operate and maintain HVAC systems, both for commercial and residential properties across the United States.

I'm sure there's many associations that are in that sector. And let's say that part of what I do is I collect data from my members and I say, I want to know how much you're selling, how optimistic you are about the market, what's slowing down, what's speeding up, what are you happy about, what are you scared about, all that kind of market intel.

And let's say I want to capture that data on a fairly regular basis, maybe annually, maybe even more frequently. And so then what do I do? Well, the classical play is the association will create a report and that's great. They'll hire a statistician to go through the survey responses if they don't have them in house.

And they'll produce a beautiful, glossy, printed report that they'll sell for a thousand, two thousand dollars, or they might have like an online version. That's awesome. So that's a core function of a lot of associations, but that report is static. It doesn't really take into account the history of those reports other than the analysis that the person is doing, saying, Hey, I'm going to compare this benchmarking report against the 50 preceding it.

So let's say it's annual and you've done it for 50 years. The trend lines have to be directly pulled out by the analyst that's creating that one-time report, right? But imagine instead of doing that, or perhaps in addition, the association says, Hey, we're gonna do an AI benchmarking service, which is dynamic, which is real time, and allows our participants, whoever's buying the service, to ask questions of the report, for the report to kind of like regenerate itself based on their particular areas of interest.

So let's say that I'm an HVAC contractor, continuing that example, but I only service mid-sized commercial buildings, so I'm particularly interested in how other people in other regions of the United States are feeling about, you know, mid-market commercial HVAC as opposed to residential HVAC. So there's a segmentation problem where the general report may be aggregated across all of the sub-segments.

Um, I might be interested in slicing it that way. And the report may not have that particular slice available to me because it's a static report. So the value to me as the member is therefore not zero, but it's limited. But if I can get exactly what I want, that's interesting. But to take that even a step further, now what if the member could actually interact with an AI agent saying, Hey, my business is seeing these kinds of challenges or these kinds of opportunities.

What do you think? Right? And let's say the HVAC association not only has that benchmarking data, but also has their full knowledge base. Every article that's ever been written, every conference proceeding that's ever been recorded. And that, uh, that knowledge assistant combined with the benchmarking data can offer bespoke advice back to that individual member, right?

If I'm that owner of a contracting business that services mid-sized commercial properties, I can get highly specialized insights and highly specialized advice, which becomes almost like a one-to-one consultative service, a very different level of the value chain, right? So I'm higher up the value chain, and associations can monetize that, and who better to do that than the association, right?

You know, I could start a website like that today, but I don't have the data for it. Uh, the HVAC association not only has the cornered resource, the data, but they also have brand power. Generally speaking, not all associations do, but they're generally a highly trusted, highly regarded, uh, resource in their space.

So that's why I get really fired up about this. I think this is just an obvious thing for associations to go build these kinds of services and start making a lot of money.

Mallory Mejias: Mm hmm. And do you see these kinds of services as nice to have, good to test out in the future, or as essential to survive? Mm hmm.

Amith Nagarajan: I mean, I think this is Netflix versus Blockbuster. I think that this type of service is going to be what people do. And you have to kind of undermine your core business in some ways. So let's say you make a few million bucks a year on benchmarking reports. And this new service basically gets rid of the old report.

You're going to piss off a lot of people who spend their time creating those reports manually. And by the way, that physical report, or the, even the electronic version may still have value, right? It's a good thing. It's not like it's gone, but this new service eclipses the value opportunity, um, significantly.

So you're essentially displacing a current line of business, um, and that's hard politically in some organizations, but the world doesn't care about that. The world just cares about value creation. If you're creating enormous value, people will come to you.

Mallory Mejias: And what you're talking about in terms of Netflix and Blockbuster would be called counter positioning, right? Uh, referencing that book that Amith just mentioned by Hamilton Helmer called, uh, The Seven Powers. So you see this as a path associations will need to take, probably soon, to counter position themselves, so that way they can ensure longevity.

Amith Nagarajan: Absolutely. I mean, you know, counter-positioning yourself is the idea of saying, Hey, we're going to displace our own product, right? Some might say cannibalization of your own product. In counter-positioning yourself, there may be an obsolescence path for an existing product or service. In some cases, it isn't necessarily obsolescing a current product or service, but in a lot of instances it is.

So the question I would pose is would you rather be counter positioned by yourself and you build that new business within your association or let somebody else do it? Um, my belief is the association is very well positioned to go after these opportunities from a brand perspective and from a cornered resource viewpoint in terms of having unique data and unique content.

They don't tend to be well positioned to do it in terms of how nimble they are. They tend to have, you know, structure, governance-wise, culture-wise, that is kind of repellent of risk. And so when you have that kind of culture, it's very difficult to take on, uh, new opportunities like this, but that's, that's the challenge ahead of us, because the value equation for the consumer of your service ultimately is what it is.

And you can either choose to play in the new game or not. I mean, this is why Google has been freaking out about generative AI ever since the ChatGPT moment in the fall of '22, because Google realized immediately that there was a step change in value creation in the world. ChatGPT in its first incarnation, at the time there was nothing else like it, was fundamentally more useful than search alone.

It doesn't replace search entirely. There's still places where you and I, Mallory, we use Google, we use Perplexity, which is more search oriented, but there's things that you can do that absolutely lower your need for search in the context of using a generative AI chat solution. There's things where you would have previously gone to Google, run a search, grabbed a few resources off the first page of links, probably, maybe if you're, you know, a little bit more detail oriented than me, the second or third page, and then you would have composed whatever work product you were going to build.

Now you go to ChatGPT or Claude or whatever, and you can create everything in one place. So, essentially, a step function in value creation in the world was all of a sudden reached. And that's why 100 million users flocked to ChatGPT in the first, like, 45 days or so. Um, fastest, you know, growing app to 100 million users ever.

And it was, it was not the novelty, it was the value creation. So, fundamentally, when an organization sees that existential risk, they start to invest in it. But I would say that you don't wanna be like Google in the sense that, you know, Google actually invented the underlying technology that powers ChatGPT, the transformer, um, you know, AI model.

And so, but then they didn't do a whole lot with it until they realized somebody else had actually, you know, figured out how to make money from it, and it's going to displace their core business. I would say the same thing for associations: rather than waiting for someone who isn't as well positioned as you to be that challenger.

Why don't you be the challenger to yourself?

Mallory Mejias: I can sympathize with the association leader who's listening to this right now and thinking, well, we sell these benchmarking reports and we bring in revenue and they do really well, and it would be very difficult for us to take a temporary hit in revenue to dedicate resources to coming up with some AI offering or service.

Can you speak to that a little bit?

Amith Nagarajan: Sure. I would offer it as a multi-tiered offering. So let's say we have this HVAC benchmarking report, industry trends report, and we sell that, and it's whatever the price is. I wouldn't stop doing that. Um, what I would do is I'd say, hey, we have a new offering, which is if you pay an extra X dollars, you also get the interactive version of this, where the report is available to you in an AI format where you can ask questions, you can slice and dice it.

So you have kind of that base level of value that the report creates. And creating a report manually, by the way, and having a static snapshot of, like, your association's view of the industry at that point in time, that is actually fundamentally useful. It's just that by itself, it doesn't create the kind of value that this new generation of technology can help you create.

So I would do it as a two-tiered, or higher, more tiers, but like a two-tiered offering. We talk about that in our book Ascend, you know, where there's like this playbook where you say, look, that base level of value, maybe it's bundled in a membership, maybe it's a product you sell. But then there's this tier above it, which no one is expecting from you today.

And you start to offer that as a premium service for additional dollars. Um, and so, you know, in some cases there are services and products that you offer that don't make any sense to continue. In the benchmarking example, actually, it's my belief that the association's view of the world is in fact very relevant.

And that's part of what you feed to the AI, is you say, hey, this is our view of the world, of what's going on in the industry right now, and here's the last 50 years or 30 years worth of those views, um, and then you can build on that, and then the AI can add to that, right, with the dynamic nature of what conversational AI can do.

So I think there's a way to kind of, you know, have your cake and eat it too, a little bit. Um, what you have to do is invest a little bit of time and a little bit of money in experimenting around this stuff. Because, you know, like, a simple thing you could do, and this wouldn't take any deep integration, is go to Claude and load up the most recent benchmarking report, or take the last three or four of them as PDFs.

Drop them into Claude and say, here are the last four years of benchmarking reports. Your job now is to be a conversational analyst that interacts with my members and provides additional insights based on your knowledge of the world and what specifically is in these reports. Whenever there's a conflict between your training data and these reports, your training data is superseded by our reports.

Our reports are always the source of truth, which is part of this thing called grounding that you can do in your prompting strategy. And then you can play with that setup in Claude. And you can do the same thing in ChatGPT. You could create a custom GPT. Gemini has similar capabilities. Um, and then go talk to the thing and see what it provides.

See exactly what it is that you're able to do with that kind of a setup. And that's kind of a proxy to understanding the value creation that would exist if you took this to scale.
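
A minimal sketch of the grounding experiment Amith describes, for anyone who would rather script it against the Anthropic API than paste PDFs into the Claude web interface. It assumes the reports have already been extracted to plain text; the model name is a placeholder, and the helper shown is illustrative, not a Sidecar product.

```python
# Sketch: a "conversational analyst" grounded in your benchmarking reports.
# Assumes the anthropic package is installed and ANTHROPIC_API_KEY is set.
import anthropic

REPORTS = [
    # "full text of the 2022 benchmarking report...",
    # "full text of the 2023 benchmarking report...",
]

SYSTEM_PROMPT = (
    "You are a conversational analyst for our members. Below are our most "
    "recent annual benchmarking reports. Provide additional insights based on "
    "your knowledge of the world and what is in these reports. Whenever your "
    "training data conflicts with the reports, the reports are the source of truth.\n\n"
    + "\n\n---\n\n".join(REPORTS)
)

client = anthropic.Anthropic()

def ask(question: str) -> str:
    """Send one member question to the grounded analyst and return its reply."""
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder; use whichever current model you prefer
        max_tokens=1024,
        system=SYSTEM_PROMPT,
        messages=[{"role": "user", "content": question}],
    )
    return response.content[0].text

print(ask("How has sentiment about mid-sized commercial HVAC trended across these reports?"))
```

The same idea works as a custom GPT in ChatGPT or in Gemini: the reports become the grounding context, and the instructions make them the source of truth.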

Mallory Mejias: Moving to topic two, the falling cost of AI tokens. One trend that we've discussed recently on the pod is the rapidly falling cost of AI tokens. AI tokens are the basic units of computation for large language models, or LLMs, like GPT-4, used to process and generate text. The pricing of these tokens directly impacts the cost and feasibility of AI applications across industries.

Dr. Andrew Ng, a leading figure in AI, recently highlighted this trend, pointing out that the price of GPT-4 tokens has dropped dramatically from $36 per million tokens at its release in March 2023 to just $4 per million tokens today. This represents a staggering 79 percent price drop per year.

And the price drop is driven by several factors, one of those being competition from open source models. The release of open weight models like Llama 3.1 allows API providers to offer services without the burden of recouping model development costs. The price reduction is also due to hardware innovations.

Companies like Groq, SambaNova, and Cerebras are pushing the boundaries of AI computation, enabling faster and more efficient token generation. And then finally, semiconductor advancements: giants like NVIDIA, Intel, and Qualcomm continue to improve AI-specific hardware. The implications of this price drop are far-reaching.

It makes AI applications more economically viable across a broader range of use cases. Even complex, token-intensive applications that might have been prohibitively expensive before are becoming feasible. For instance, an application using 100 tokens per second continuously would now cost only $1.44 per hour to run.
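
That figure is easy to verify with back-of-the-envelope arithmetic, using the $4-per-million-token price cited above:

```python
# Cost of streaming 100 tokens per second, non-stop, for one hour.
tokens_per_second = 100
price_per_million_tokens = 4.00  # dollars, the GPT-4 price cited above

tokens_per_hour = tokens_per_second * 60 * 60            # 360,000 tokens
cost_per_hour = tokens_per_hour / 1_000_000 * price_per_million_tokens
print(f"${cost_per_hour:.2f} per hour")                   # -> $1.44 per hour
```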

The trend is expected to continue, with Dr. Andrew Ng predicting further rapid declines in token prices based on the technology roadmaps of various companies. Those roadmaps include improvements in semiconductors, development of smaller yet powerful models, and innovations in inference architectures. For AI companies and developers, this changing landscape presents both opportunities and challenges.

It opens up new possibilities for AI applications, but also requires strategic thinking about model selection, cost optimization, and future-proofing applications. Amith, in a, in a short summary, why is this so important? We've covered it a few times on the pod, which always signals to me that this is something important and that we're going to keep talking about, but what do you think?

Amith Nagarajan: Uh, I like that, Mallory. It's kind of like giving me a system prompt saying don't go into verbose mode. So I appreciate that. So, um, the bottom line is, it's accessibility, right? So the cheaper something is, the more it gets used. So it's kind of like the assumption of insatiable demand. So if you think about the growth of the economy, broadly speaking, and we say technology tends to drive the cost of things down, and we say, well, what does the curve look like and where are we seeing growth happen?

So if you assume that quantity demanded is fixed, then what's going to happen is, as price goes down and the overall demand doesn't increase, you're just going to end up with a surplus. And you're going to have essentially people out of work in the case of labor. Um, in reality, what ends up happening is that good things tend to grow demand.

If you make tokens a tenth of a cent per million tokens instead of $36, which they're not at yet, but they will be soon, then essentially it becomes completely free. It's kind of like the mindset around video, uh, on the internet and bandwidth. You know, we're doing this in high res on Zoom, and people are consuming this on YouTube or on, uh, you know, just audio, and we don't care about the incremental cost of the bandwidth we're using, because internet bandwidth has basically become free.

You know, we're paying 50, 100 a month, whatever the amount is, for, uh, bandwidth at home. That's like, you know, gigabit fiber kind of connections, and that would have been thousands of dollars just a few years ago. So the same thing is happening here with tokens. And so what that basically boils down to is if AI is almost free in terms of its use, then people will use it more. And the more it's used, the freer it'll get, right? If that's a word, the less expensive it'll become. And that will encourage more adoption.

Mallory Mejias: What kinds of applications of AI are pending lower token cost? What do you expect to see come out of this?

Amith Nagarajan: Well, there's a lot of different things. So, from our perspective, across the Blue Cypress family, when we think about the apps we're developing, when we first started building a lot of these generative AI solutions, solutions like BettyBot being one of the first ones. We actually had a pretty significant concern about the cost of token consumption because Betty was consuming a lot of tokens, right?

And in order to be a knowledge assistant that's trained on the world's information for that association, that's a lot of, a lot of content, a lot of tokens to process requests in an effective way. Um, and so that was a major factor in our original economic modeling of the uptake of the solution. But now tokens are very close to free.

So it's just completely irrelevant. So people who are interested in a solution like that are not worried about their token consumption nearly as much. It's not that they're completely not worried about it, but it's changed. The other thing that's happening is, you know, we talk a lot on the pod about multi-agent solutions.

We talk about that in the learning hub and in the book, and that's becoming a really important concept. And what you do in a multi agent solution is you're talking to one or more AI models over and over and over and over again, repetitively, in kind of a loop. Uh, and sometimes you're firing off multiple requests in parallel.

So like Skip, which is our other AI agent that's a conversational agent for data. So Skip is like ChatGPT for your database. Um, Skip will execute a bunch of different inferences at the same time. So Skip will talk to a bunch of different AI models in parallel, and we don't have to care about the cost that much.

It's really, really insignificant now. Whereas again, a year and a half ago, when we started working on Skip, it was a significant concern, because Skip is what we'd call token heavy. You know, Skip is an agent that uses tons of tokens in order to do what he does. So,

Mallory Mejias: For the average listener, I think we hear, you know, a 79 percent price drop is what we said. A 79 percent price drop, and we think that sounds great. Uh, but I think in actuality, we're not exactly sure if that's a total non-issue at this point, but you did say you feel like it's basically free? Are you, are you waiting for Skip and Betty, for the tokens to be even cheaper and cheaper?

Or at this point, is it, does it feel free?

Amith Nagarajan: I mean, it feels pretty close to free relative to the value creation of something like a Skip or Betty. You know, we're able to do things that would previously have taken lots of people. And so the idea is, like, you know, the ROI is based on, hey, you couldn't hire six people to be data analysts at your association at

you know, the significant salary data scientists would take, right? Um, versus Skip can do it all and do it faster and, frankly, better than any normal data scientist could do. So the value equation is so strong for those products that even if the token consumption cost is something, you know, it's non-zero, you know, if it was five, ten thousand a year of token consumption or something like that, nobody's going to care about that.

Now, I'd rather have that be close to zero. If that was a $50 bill, then it's like, you know, turning on a light bulb in your house. You know, no, you shouldn't waste electricity, but if you left one LED light on that's consuming seven watts, you know, then you might not care as much about it. So, um, that's kind of where we're headed with this stuff.

It's, it's obvious, the competition, the scaling, you know, when you, when you scale something as aggressively as this, you know, prices drop from economies of scale. And then in addition to that, if you have deep competitive dynamics in an environment, you're going to see prices drop further, and the AI, the technology itself, is already a commodity.

Llama 3.1 is basically just as smart as GPT-4o. Llama 3.1's 405 billion parameter model, their largest model that was released a couple months ago, is highly competitive with GPT-4o. Uh, it's not perfect, and it's not at the level of GPT-4o in a couple of areas, but for most applications you're building, this is good. So there's no cost associated with that model itself. There's just the hardware cost and then the margin that the operator of the hardware needs to have. Um, and that's, you know, that's arbitraged out, right? So it's essentially saying, hey, there's going to be no real margin in that business, which means for consumers like all of us, it's great, because you get more choice, better product, and lower cost.

Mallory Mejias: Are you using a mixture of models in a multi-agentic solution like Skip?

Amith Nagarajan: Yes, yeah, it's a good idea to have multiple models. At a minimum, you want to have a small model and a large model. So you might say, hey, um, this is a really complex question. Let's fire it off to the large model. But then for a lot of the simpler decision making, you can go to the small model, which is both cheaper, but also faster.

Um, you know, so for example, Llama's smallest model, um, their 8 billion parameter model, they actually call it Llama 3.1 Instant on the Groq platform. You mentioned Groq, that's G-R-O-Q, uh, with a Q. Um, that, that platform has a really interesting hardware architecture that literally is nearly instant. It inferences at like 800 tokens per second, which for non-technical people basically means it's faster than you can possibly read it.

Um, so, uh, ultimately, uh, we already are there and this stuff just keeps getting better. So, from my point of view, using a small model is a really good thing to do for a lot of the day to day basics. And then you build your architecture so that your, your system is smart enough to know when it needs to go to the big model versus the small model.

And that's the simple version of the answer to your question. But you might also use a hybrid where you say, hey, you know what, I want to get the best possible answer. So, what I'm going to do is I'm going to fire off the same request to Llama 405B, to GPT-4o, and to Gemini, in parallel. At the same time, I'm going to fire off the same prompt to all three of these best-in-class models.

Then I'm going to pull back the responses and I'm going to compare them. And I'm going to compare them using all three models. So I'm going to ask all three models to compare the answers and synthesize the best response. So you're kind of like using them all in parallel together. And if tokens are effectively free and these things are really fast, you can do that and have unbelievably higher quality, right?

Because it's like the multi agentic nature of it, but you're using the best of both worlds. Each of these model architectures, different training data, um, more heterogeneous kind of interactions. So that's one way you can approach it.
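
Here is a rough sketch of the fan-out-and-synthesize pattern Amith is describing. The model names are illustrative, and query_model is a hypothetical helper standing in for whichever provider SDKs (OpenAI, Anthropic, Groq, Google, etc.) you actually wire up; this is a sketch of the pattern, not any specific Blue Cypress implementation.

```python
# Sketch: send the same prompt to several models in parallel, then have a
# model-as-judge compare the candidates and synthesize one answer.
from concurrent.futures import ThreadPoolExecutor

MODELS = ["llama-3.1-405b", "gpt-4o", "gemini-1.5-pro"]  # illustrative names

def query_model(model: str, prompt: str) -> str:
    """Hypothetical helper: call the given provider's chat API and return the text."""
    raise NotImplementedError("wire this up to your provider SDK of choice")

def best_of_ensemble(prompt: str) -> str:
    # 1. Fan out: fire the same prompt at every model at the same time.
    with ThreadPoolExecutor(max_workers=len(MODELS)) as pool:
        answers = list(pool.map(lambda m: query_model(m, prompt), MODELS))

    # 2. Judge: ask a model to compare the candidates and synthesize the best response.
    judge_prompt = (
        f"Question: {prompt}\n\n"
        + "\n\n".join(f"Candidate answer {i + 1}:\n{a}" for i, a in enumerate(answers))
        + "\n\nCompare the candidate answers and synthesize the single best response."
    )
    return query_model(MODELS[0], judge_prompt)

# Usage: final = best_of_ensemble("Summarize this quarter's HVAC market sentiment.")
```

Amith's description goes one step further, asking all three models to judge and then merging their judgments; the same loop structure applies, just repeated once more over the synthesized answers.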

Mallory Mejias: Interesting. And when you say you are looking at the results of these models, you mean, uh, a supervising AI of sorts?

Amith Nagarajan: Correct. Yeah. No, no human that I know could possibly keep up with that. Um, it's, it's definitely something that another AI has to look at for sure.

Mallory Mejias: Moving to Topic 3, Generational Differences in AI Adoption. A recent DICE survey of U. S. technology professionals revealed significant disparities in generative AI usage across age groups. 40 percent of tech workers aged 18 to 24 use generative AI at least weekly on the job. Nearly half of tech professionals aged 55 or older don't use the technology at all.

This generational disconnect could potentially disrupt enterprise plans for AI adoption. Now, despite the hype surrounding generative AI, its impact on work has been limited so far, at least as it pertains to this survey. Over half of the survey respondents reported that AI has only slightly impacted their work.

Younger IT professionals, 34 and under, are more concerned about AI's impact on their careers compared to older workers, 45 and above. The growing importance of AI is influencing talent strategies as well. More than 80 percent of HR professionals anticipate increased demand for AI professionals in the next six months.

Many businesses are focused on upskilling existing staff to meet AI talent requirements, and we're starting to see companies implement AI adoption roadmaps that include employee training. Several obstacles remain in the path of widespread AI adoption. The generational gap in AI usage could impede adoption efforts, and most companies lack clarity on their staff's existing AI proficiency levels, which makes it difficult to develop effective upskilling strategies.

Amith, as a longtime leader of companies, I will say, is this to be expected?

Amith Nagarajan: Yes.

Mallory Mejias: Hmm. So

Amith Nagarajan: How's that for, how's that for a short

Mallory Mejias: that was too short. You, you can expand on this one. You can expand.

Amith Nagarajan: So every time there's a disruptive technology, it is, it tends to be, uh, overhyped in the short run and underestimated in the long run in terms of its impact. Think about the internet, think about mobile, social even, um, I would call that more of an app than a, than a framework, but, um, ultimately AI is kind of the biggest of them all.

And it's being overhyped in some ways. And what that results in is some people who are kind of the incumbents in a given field saying, ah, I'm going to wait for this to pass, or I really don't want to deal with this and it probably won't be that big of a deal or I'll adopt it once it's more mature. Um, but what we're not really taking into account in that is how quickly AI is moving.

It's faster than anything any of us have ever seen. So that's a problem. Um, you know, with internet adoption as rapid as that has been over the last three decades, it's kind of slow compared to what's happening with AI adoption. So I'm pretty worried about this. I think that there's a whole kind of, I mean, there's, these are generalizations.

That's what surveys intentionally do. But if you do have that kind of demographic disparity, um, you're going to have problems because, you know, people are living longer and working longer. And, you know, of course, AI, along with other exponential technologies, is going to further extend those time spans, hopefully.

Um, and so the question is, is what are these people going to do if they don't know how to use AI, and AI is mandatory? You know, there's, there's going to be a major skills gap. So, I fear that people who don't get on this soon enough are going to have a hard time. They're gonna have a hard time learning it, but they're also gonna have a hard time keeping up with it because, you know, I mean, you and I spent a lot of our time talking about this and thinking about this topic, and I think it's still overwhelming.

Um, I feel that way all the time. Um, so, you know, I think people who've done nothing with it, um, are really at a disadvantage already, and they'll be at a deeper disadvantage if they don't jump on the learning path pretty soon.

Mallory Mejias: I think you touched on an interesting point, which is a lot of people see these new emerging technologies popping up and think, Oh, this train will pass, right? It'll just be a wave. I think you and I are both in agreement, right? That AI is not one of those emerging technologies we think will pass in a wave. But how would you recommend as a leader communicating this to your staff or getting that buy in?

I know education is a key piece, but, um, are there ways that you would stress that this is here to stay?

Amith Nagarajan: Yes. I think one quick side note before I answer that question is I think one thing that's different about this technology shift is it feels more personal to a lot of people and both because of all the sci fi lore that's preceded the actual arrival of true AI in our lives, um, for a lot of different reasons, you know, uh, but, but I think people feel a lot more.

With this technology shift, you know, when the internet came out, right, or the internet kind of entered the popular consciousness, and same thing with mobile technology, that wasn't a shift so much where people had pros and cons to, should you have a smartphone, or like maybe some people didn't want a smartphone, they're like, I'm just fine with my flip phone.

I'm just fine without it. But there weren't, like, these deep emotional issues of, like, will this destroy industries? Will this change our way of life? Right. So this is like a very personal topic to a lot of people. And so, beyond waiting for those clouds to pass kind of thing, I think a lot of people are just like, I hate this stuff.

I don't want anything to do with it. It's just terrible. So that emotional reaction, I get that there's a whole group of people that feel that way. And I empathize with that deeply. The flip side of that is the technology doesn't have feelings. And so it's going to be here whether you like it or not. And so as a leader, what I would tell people is very simple.

And I think this is a mandatory thing for leaders to stand up and do. It's that, um, it is unlikely you by yourself, like, will be displaced by AI by itself, but it is extremely likely that you will be replaced by someone who's very good at AI if you're not. So it's that simple. Like, if you're not a good user of AI, it's like saying you don't know how to use a computer, you don't know how to use a telephone, you don't know how to use electricity.

If you don't know how to use basic technology in your job, which AI is very quickly going to become a forgotten technology, we're all talking about it right now as if it's like the most important thing, which it probably is, but very soon it's going to be forgotten about and it's just going to be assumed.

And so, if you don't know how to use this stuff, you're gonna have a tough time. Um, I will also say that there is a flip side to that argument, which is that AI is becoming so much smarter that you won't have to be as skilled in using it to be good at using it. So like, for example, we have a whole course on the Sidecar Learning Hub called Prompt Engineering, right?

And it's a super popular course. By the way, it's only 24 bucks, for those of you that are interested in diving deep and attaining that skill. It's an awesome course. People love it, but that course is going to be completely irrelevant in three or four years, um, but for the most technical people, because the systems are getting so much better at metacognition, which is this process of thinking about thinking, right?

So if you go to the thing and you're like, Oh, I just want to solve this problem right now. It'll like spit out some garbage at you if you don't know how to talk to the AI. So you have to train yourself on how to talk to the AI, which is, prompt engineering is a fancy way of saying, learn how to talk to the AI, right?

Um, that's all it is. So, but like the AI is getting smarter and smarter and smarter, so much so that for most people, just like having just a generic conversation like you would with some person, will work with AI systems, perhaps even in a year or two. Now, I'm not suggesting you shouldn't take the Sidecar Prompt Engineering course.

You should, because it's awesome and it's, for now, very useful. But the point is, is that AI is actually making it easier to use AI. That has not been true with other technologies in the past. That being said, I still think there's a major disadvantage for anyone who's not taking the time to get going and understand this stuff.

Um, I think the biggest issue is actually gonna be senior leadership. Um, so I tend to see, and this is in senior leadership, they tend to be further along in the career path, therefore older, so when it comes to the study you referenced, you know, in the second category for the most part. But I think the problem there is there's a tendency in a lot of organizations, particularly in nonprofit land, where the senior leaders say, oh, cool, we have technology people for that.

Um, this is not a technology conversation. This is a strategy conversation. This is a business conversation. If you don't know that power tools exist, then you have a legion of carpenters who use hand tools to do your work. And all of a sudden, if somebody else next door knows that power tools exist and they train their workforce on how to use power tools, they are going to kick your ass.

So what you have to do is learn what the tools are and what their capabilities are. Just like in the first topic, we said, Hey, you can take that cornered resource of all your data and your content and turn it into something magically different, right? That's unbelievable value creation. Uh, but if you don't know that that tool is capable of doing that, you cannot conceive of the business model, or the strategy to execute, you know, that will result in that durable differential return we started talking about earlier in this episode.

So, uh, to me, I see that as a major gap, which of course is where I'm focusing, trying to spend time with senior execs in these organizations and convince them that they personally have to learn what this stuff is about, even at the capabilities level, not so much even using the tools. So there's a big gap, and it's, I think it's my biggest concern for the association market and the nonprofit market in general.

Mallory Mejias: I would say this is the, yeah, by far the biggest technology change that I'll see in my professional career, or at least that I've seen thus far, I should say, who knows what's going to happen in the future. So it is hard for me to think about, as you said, a few years from now, us no longer talking about prompt engineering and us no longer talking about AI because it's so ingrained in what we're doing.

Uh, that it won't be worth the topic of conversation. That in itself is hard for me, I think, to, to think about. But going back, talking about the internet, which is an example we use a lot. How can I phrase this? Did, were there internet trainings for company staff? Was that something that was rolled out kind of in the same way that we're talking about?

Amith Nagarajan: Yeah, back in the day, you know, corporations would roll out how-to-use-a-web-browser training. That was a thing. Um, yeah, it was a thing. Of course, there were, like, you know, eight websites you could go visit on the internet at the time. So, you know, you'd learn how to use Yahoo and you'd learn how to use whatever. So yeah, there was training on that. There were trainings that people would provide on a variety of things, and on the technical side, there were, like, whole legions of people that would say, Hey, I am an internet developer.

I do web application development versus something else. And most corporate applications were built as windows apps back then. And doing web development was a different thing. Now, when you say software development, people assume that you're referring to distributed internet based development unless you say otherwise.

Right? So. Um, I think it's, I think it's one of the things like it just becomes a tool in the toolkit. Um, the pandemic accelerated this along with technology, but like video conferencing and other remote work style, you know, uh, models like what we're doing right now, you know, it's, it's just something that people have to adapt to.

And, um, in any event, um, the study didn't surprise me. It does concern me. And I think that this is just like other technology shifts. The difference is it's moving so much faster. So we have less time to fix the problem.

Mallory Mejias: Hmm. Well, I hope that if Sidecar had existed back then, we would have been in the business of training associations how to use the internet.

Amith Nagarajan: Totally would have been. Yeah.

Mallory Mejias: Everyone, thanks for tuning in to today's episode 46. We look forward to seeing you next week.

Post by Emilia DiFabrizio
September 5, 2024