1
00:00:00,838 --> 00:00:03,336
[Anna Schultz]: You’re listening to the Real Intelligence podcast
2
00:00:03,794 --> 00:00:08,844
[Anna Schultz]: presented by RXA, a leader in business intelligence and data science consulting services.
3
00:00:09,405 --> 00:00:11,965
[Anna Schultz]: We're here to bring attention to the unique stories,
4
00:00:12,525 --> 00:00:13,004
[Anna Schultz]: perspectives,
5
00:00:13,485 --> 00:00:19,660
[Anna Schultz]: challenges, and successes that individuals in the data industry face at every career stage. Welcome to the show.
6
00:00:21,180 --> 00:00:23,680
[Anna Schultz]: Thank you for tuning into the Real Intelligence podcast.
7
00:00:24,153 --> 00:00:30,199
[Anna Schultz]: I'm Anna Schultz, Marketing Manager at RXA, and joining me is the CEO and founder of RXA, Jason Harper.
8
00:00:31,249 --> 00:00:39,302
[Anna Schultz]: Today, we have an extra special episode planned to explore the revolutionary impact of ChatGPT and generative AI on businesses across the globe.
9
00:00:40,433 --> 00:00:45,757
[Anna Schultz]: This game changing technology is redefining the way companies operate, elevating business intelligence,
10
00:00:46,313 --> 00:00:48,617
[Anna Schultz]: and democratizing data science like never before.
11
00:00:49,350 --> 00:00:54,870
[Anna Schultz]: It's not just the next big thing. It's the thing that will shape the way businesses thrive and compete moving forward.
12
00:00:55,670 --> 00:01:02,064
[Anna Schultz]: So, we're thrilled to have an extraordinary lineup of experts with us today who will demystify ChatGPT and generative AI,
13
00:01:02,701 --> 00:01:05,246
[Anna Schultz]: and reveal how you can harness its potential to drive revenue growth,
14
00:01:05,897 --> 00:01:09,318
[Anna Schultz]: enhance decision making, and make data science more accessible than ever.
15
00:01:10,351 --> 00:01:17,527
[Anna Schultz]: Joining us are Mikayla Penna, Sales Account Executive at RXA, and our go-to authority on generative AI’s role in the business landscape.
16
00:01:18,004 --> 00:01:20,255
[Anna Schultz]: And Jacob Newsted, Data Engineer at RXA,
17
00:01:20,631 --> 00:01:23,416
[Anna Schultz]: who will break down the technical wizardry behind ChatGPT.
18
00:01:24,307 --> 00:01:32,040
[Anna Schultz]: We also have the brilliant Matt Schaefer and Megan Foley from Ready Signal, who will share their insights on practical business applications for this groundbreaking technology.
19
00:01:33,010 --> 00:01:41,381
[Anna Schultz]: So, let's jump into the extraordinary world of ChatGPT and generative AI to empower your business with the knowledge to capitalize on this transformative technology.
20
00:01:42,725 --> 00:01:52,532
[Anna Schultz]: To start us off, we know that business leaders are keen to understand how ChatGPT and generative AI can revolutionize their operations, business intelligence and data science.
21
00:01:53,411 --> 00:01:57,325
[Anna Schultz]: But first, I think it's helpful to truly understand - what is generative AI?
22
00:01:58,059 --> 00:02:02,677
[Anna Schultz]: Jacob, can you walk us through the general definition and more specifically how ChatGPT works?
23
00:02:04,110 --> 00:02:04,588
[Jacob Newsted]: Of course.
24
00:02:05,304 --> 00:02:12,329
[Jacob Newsted]: So, I think really the easiest way to go about this is starting just at the words ChatGPT and working backwards.
25
00:02:13,842 --> 00:02:14,342
[Jacob Newsted]: GPT,
26
00:02:14,719 --> 00:02:18,978
[Jacob Newsted]: we can start with that, stands for generative pre-trained transformer.
27
00:02:20,230 --> 00:02:23,280
[Jacob Newsted]: Going a little bit deeper, we're gonna have a lot of steps back,
28
00:02:24,213 --> 00:02:27,757
[Jacob Newsted]: is transformer. It's a specific machine learning architecture
29
00:02:28,134 --> 00:02:31,163
[Jacob Newsted]: that kind of takes a human intuition to looking at data.
30
00:02:31,721 --> 00:02:33,555
[Jacob Newsted]: Transformers have a mechanism
31
00:02:33,889 --> 00:02:37,559
[Jacob Newsted]: called attention, where when you look at, let's say a sentence,
32
00:02:38,596 --> 00:02:40,351
[Jacob Newsted]: like the cow jumped over the moon,
33
00:02:41,163 --> 00:02:48,489
[Jacob Newsted]: certain things like ‘the’ and ‘cow’ are related to each other. And so, the machine learning algorithm kind of pieces these little things together,
34
00:02:49,127 --> 00:02:49,579
[Jacob Newsted]: and it
35
00:02:50,177 --> 00:02:54,557
[Jacob Newsted]: processes data, not just as a collection of words, but as a collection of relations.
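The attention idea Jacob describes, where words in a sentence are scored against each other by how related they are, can be sketched with a toy example. The word vectors below are hand-picked purely for illustration, not learned, and this is a rough sketch of attention weighting rather than a real transformer layer:

```python
import math

def softmax(xs):
    """Normalize raw scores into weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy word vectors (hand-picked, purely illustrative -- real models learn these).
vectors = {
    "the":    [1.0, 0.1],
    "cow":    [0.9, 0.3],
    "jumped": [0.1, 1.0],
    "over":   [0.2, 0.8],
    "moon":   [0.7, 0.4],
}

def attention_weights(query_word, sentence):
    """Score every word against the query by dot product, then softmax.

    Higher weight = the query word 'attends' more to that word.
    """
    q = vectors[query_word]
    scores = [sum(a * b for a, b in zip(q, vectors[w])) for w in sentence]
    return softmax(scores)

sentence = ["the", "cow", "jumped", "over", "the", "moon"]
weights = attention_weights("cow", sentence)
# 'cow' attends most strongly to words whose vectors point the same way.
```

Real attention also uses learned query, key, and value projections; this strips that away to show just the "relations between words" intuition.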
36
00:02:55,194 --> 00:03:05,073
[Jacob Newsted]: This is really important when we kind of go together onto a higher-level thing, which is the generative part. So going back to GPT, generative pre-trained,
37
00:03:06,345 --> 00:03:12,753
[Jacob Newsted]: the way GPT models are specifically made is they are fed a series of text,
38
00:03:13,552 --> 00:03:14,052
[Jacob Newsted]: and
39
00:03:14,511 --> 00:03:19,460
[Jacob Newsted]: it is cut off at a certain point. And the machine learning model is then asked, okay,
40
00:03:20,174 --> 00:03:20,968
[Jacob Newsted]: predict the next word.
41
00:03:21,682 --> 00:03:23,769
[Jacob Newsted]: And it trains itself
42
00:03:24,713 --> 00:03:25,768
[Jacob Newsted]: kind of in an
43
00:03:26,141 --> 00:03:30,372
[Jacob Newsted]: unsupervised way, although some people might be shouting at me in the background, but
44
00:03:30,824 --> 00:03:33,324
[Jacob Newsted]: it's easy to think about it as
45
00:03:33,856 --> 00:03:37,031
[Jacob Newsted]: unsupervised, as you just give it some text and you say complete this sentence,
46
00:03:37,587 --> 00:03:45,267
[Jacob Newsted]: word by word by word. It forms these relations using attention. And eventually, it generates a complete sentence. Now
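The "complete this sentence, word by word" objective can be illustrated with a deliberately tiny stand-in: counting which word follows which in a small corpus, then generating greedily. Real GPT models learn a neural network over billions of tokens; the corpus and helper names here are invented for the sketch:

```python
from collections import Counter, defaultdict

# Tiny training corpus -- a crude stand-in for "all the text on the Internet".
corpus = "the cow jumped over the moon and the cow slept".split()

# "Training": count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def complete(start, n_words=4):
    """Greedily append the most frequent continuation, word by word."""
    out = [start]
    for _ in range(n_words):
        options = follows.get(out[-1])
        if not options:
            break  # no known continuation; stop generating
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(complete("the"))  # -> "the cow jumped over the"
```

A GPT model does the same "predict what comes next" step, but with a transformer over attention-derived relations rather than raw bigram counts, which is what lets it generalize far beyond sentences it has literally seen.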
47
00:03:45,723 --> 00:03:49,484
[Jacob Newsted]: take a step back and do that over the entire corpus of
48
00:03:50,044 --> 00:03:50,444
[Jacob Newsted]: Wikipedia.
49
00:03:50,924 --> 00:03:51,824
[Jacob Newsted]: Every public
50
00:03:52,364 --> 00:03:53,004
[Jacob Newsted]: source book.
51
00:03:53,564 --> 00:03:57,644
[Jacob Newsted]: And all sorts of text all over the entire world. Multilingual,
52
00:03:58,377 --> 00:04:01,878
[Jacob Newsted]: now even with GPT-4, we have images, audio,
53
00:04:02,356 --> 00:04:02,754
[Jacob Newsted]: video,
54
00:04:03,231 --> 00:04:05,164
[Jacob Newsted]: all sorts of things. And
55
00:04:05,714 --> 00:04:08,850
[Jacob Newsted]: it generates these kinds of very general
56
00:04:09,788 --> 00:04:12,025
[Jacob Newsted]: ideas of relationships between words,
57
00:04:12,679 --> 00:04:13,179
[Jacob Newsted]: sometimes
58
00:04:13,878 --> 00:04:16,914
[Jacob Newsted]: images, and it's able to generate piece by piece,
59
00:04:17,952 --> 00:04:18,852
[Jacob Newsted]: some sort of
60
00:04:19,231 --> 00:04:19,731
[Jacob Newsted]: predictive...
61
00:04:20,283 --> 00:04:21,795
[Jacob Newsted]: this is what should come next.
62
00:04:23,387 --> 00:04:33,188
[Jacob Newsted]: That's what GPT is. GPT-2, GPT-3, GPT-4, we're gonna keep going, and they're gonna probably keep the same architecture with small changes,
63
00:04:34,063 --> 00:04:36,051
[Jacob Newsted]: which would be a completely different topic.
64
00:04:36,544 --> 00:04:39,972
[Jacob Newsted]: But then we really have to talk about the big game changer here, which is the chat part.
65
00:04:40,769 --> 00:04:41,269
[Jacob Newsted]: Chat
66
00:04:41,646 --> 00:04:49,969
[Jacob Newsted]: actually comes from the idea that rather than training a model just on predicting the next token or next word depending on what model you're using,
67
00:04:51,330 --> 00:04:59,992
[Jacob Newsted]: we score how well it outputs things. So if you've ever used GPT before, you know that it can come up with very different
68
00:05:00,929 --> 00:05:02,126
[Jacob Newsted]: outputs to the same prompt.
69
00:05:02,938 --> 00:05:05,092
[Jacob Newsted]: Depending on, like parameters and stuff like that.
70
00:05:06,049 --> 00:05:11,074
[Jacob Newsted]: It's basically how they sample the next word, so it doesn't always do the same thing.
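The sampling behavior Jacob mentions, where the same prompt can yield different outputs depending on parameters, is commonly controlled by a temperature setting. A minimal sketch, using an invented toy next-word distribution:

```python
import math
import random

def sample_next(probs, temperature=1.0, rng=random):
    """Re-weight a next-word distribution by temperature, then sample one word.

    Low temperature sharpens the distribution toward the most likely word
    (near-deterministic); high temperature flattens it (more varied output).
    """
    words = list(probs)
    logits = [math.log(probs[w]) / temperature for w in words]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    weights = [e / total for e in exps]
    return rng.choices(words, weights=weights, k=1)[0]

# Toy distribution over candidate next words (invented for illustration).
probs = {"moon": 0.6, "fence": 0.3, "sun": 0.1}

random.seed(0)
samples = [sample_next(probs, temperature=1.5) for _ in range(5)]
# Different draws from the same prompt can pick different words.
```

This is why repeated runs of the same prompt do not always produce the same text: the model emits a probability distribution, and the product samples from it rather than always taking the top word.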
71
00:05:11,712 --> 00:05:17,479
[Jacob Newsted]: So they leverage that. They say, okay, GPT come up with five different statements to this one prompt.
72
00:05:18,278 --> 00:05:19,556
[Jacob Newsted]: And then a human says,
73
00:05:20,195 --> 00:05:20,355
[Jacob Newsted]: okay,
74
00:05:21,249 --> 00:05:25,954
[Jacob Newsted]: I'm gonna rank these one to five, on how pleasing it is for me, the human, to read.
75
00:05:26,751 --> 00:05:37,782
[Jacob Newsted]: And then it updates itself based on that over and over and over. And they actually automate it. They cheat... They actually train a model to reward it for them so that they can automatically do this, but
76
00:05:38,318 --> 00:05:48,173
[Jacob Newsted]: more or less. It's a human ranking things from one to five and updating itself based on that. So that's where we go from just predicting the next word or token,
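The ranking loop described here, generate several candidate replies, have a human (or a learned reward model standing in for one) rank them, and train on the resulting preferences, can be sketched as follows. The scoring rule is invented purely for illustration; real reward models are trained neural networks, not hand-written rules:

```python
def score(reply):
    """Pretend 'human' preference: politeness and brevity score higher.

    This rule is a made-up stand-in for the human ranker (or the learned
    reward model that automates the ranking).
    """
    s = 0.0
    if "please" in reply or "thanks" in reply:
        s += 2.0                      # reward a polite tone
    s -= len(reply.split()) * 0.1     # mild penalty for rambling
    return s

# Several candidate outputs for one prompt (invented examples).
candidates = [
    "no",
    "thanks for asking, here is a short answer",
    "well actually it is complicated and long and rambling and unclear",
]

# Rank candidates best-to-worst, as the human ranker would.
ranked = sorted(candidates, key=score, reverse=True)

# (better, worse) preference pairs: the training signal a reward model
# learns from, which in turn steers the language model's updates.
pairs = [(ranked[i], ranked[j])
         for i in range(len(ranked))
         for j in range(i + 1, len(ranked))]
```

The actual update step (reinforcement learning from human feedback) then nudges the language model toward outputs the reward model scores highly, which is the "updating itself based on that over and over" part.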
77
00:05:48,630 --> 00:05:57,039
[Jacob Newsted]: to making it human-pleasing and easy to interface with. And that's why ChatGPT is so big. Because it takes this kind of
78
00:05:57,412 --> 00:06:04,830
[Jacob Newsted]: scientific approach to predicting the next word or token and gives it a sort of human touch, something that's easier for us to
79
00:06:05,202 --> 00:06:10,249
[Jacob Newsted]: interact with. And that's why this has been such a big deal, is that now we have this
80
00:06:11,089 --> 00:06:18,004
[Jacob Newsted]: almost human-like, I hesitate to say that, but almost human-like model that anybody can interact with
81
00:06:18,381 --> 00:06:20,554
[Jacob Newsted]: with very little knowledge about
82
00:06:21,409 --> 00:06:23,959
[Jacob Newsted]: what GPT expects. It's what a human expects now.
83
00:06:25,408 --> 00:06:28,112
[Jason Harper]: I think one of the things you said in there too is just...
84
00:06:28,987 --> 00:06:39,921
[Jason Harper]: I don't know, like the secret sauce I guess, is that there's a human in the loop in this process, that human ranking. I know there's some nuance to how that's actually occurring, but it's that human ranking where it's still
85
00:06:40,694 --> 00:06:42,387
[Jason Harper]: utilizing this human in the loop
86
00:06:42,699 --> 00:06:58,887
[Jason Harper]: for training that has completely made this game changing. Right? So when you're using it, it does, it does have that feeling of, like, this is, you know, a person. I thank it. I'm just, I feel compelled to say, oh, great, thank you. Like, give it praise and, like, work with it in this, like, positive, like, environment,
87
00:06:59,299 --> 00:07:06,803
[Jason Harper]: and it rewards me back with, also like praise like, oh, that's great. Yeah. Great idea. Oh, this would be really fun to work on. Good luck with your project.
88
00:07:07,192 --> 00:07:13,458
[Jason Harper]: Like, it's really interesting. Like those little nuances to that that really make it feel like I'm working with someone.
89
00:07:14,014 --> 00:07:19,859
[Jason Harper]: Yeah. I mean, maybe that is creepy and is bad, but I do feel like I'm working with someone when I'm in there playing with this tool.
90
00:07:20,739 --> 00:07:41,417
[Jacob Newsted]: Yeah. And that's like one of the big breakthroughs. There's a couple of, like, big, kind of, obstacles we have to go through, but just interacting with these models. I mean, it's you. It's anybody. Like, I'm the nerdy dude that kind of is like, oh, hey, numbers cool. But everybody else is interacting with these models. So it's very huge that people get this kind of feedback. It makes a big difference.
91
00:07:44,834 --> 00:07:47,138
[Anna Schultz]: Absolutely. Thank you, Jacob, for walking us through that.
92
00:07:48,266 --> 00:07:49,721
[Anna Schultz]: Now that we have this
93
00:07:50,095 --> 00:07:57,411
[Anna Schultz]: general understanding of the program. I think it's important that we pivot and discuss how businesses can actually apply ChatGPT today.
94
00:07:58,300 --> 00:08:06,320
[Anna Schultz]: So, a question for the larger group, maybe Mikayla, Matt and Megan. Can you walk us through how you've seen ChatGPT be used to optimize business operations?
95
00:08:10,305 --> 00:08:12,710
[Matt Schaefer]: I'll go first. So I think that,
96
00:08:13,798 --> 00:08:24,299
[Matt Schaefer]: to Jacob's comment, the way you summarized it was fantastic, especially for someone on this call that's not a data science practitioner. I've spent my entire career in the world of data and analytics, but
97
00:08:24,898 --> 00:08:37,128
[Matt Schaefer]: I think that what I see here is a fundamental game changer making, kind of, this world of data science much more approachable to the masses. But for those that can really harness this as a utility in its current form,
98
00:08:37,767 --> 00:08:39,146
[Matt Schaefer]: I think it can be a fundamental
99
00:08:40,018 --> 00:08:43,540
[Matt Schaefer]: game changer from an efficiency standpoint and things like content
100
00:08:44,074 --> 00:08:44,313
[Matt Schaefer]: creation.
101
00:08:45,029 --> 00:08:47,972
[Matt Schaefer]: It's things as simple as Linkedin posts, blogs,
102
00:08:48,702 --> 00:08:49,599
[Matt Schaefer]: email prospecting,
103
00:08:50,132 --> 00:08:55,932
[Matt Schaefer]: you know, I think that there's a ton of application today, and it's in those types of realms, at least from my vantage point,
104
00:08:58,017 --> 00:09:00,192
[Matt Schaefer]: that we've only scratched the surface. And
105
00:09:00,968 --> 00:09:06,971
[Matt Schaefer]: it's amazing. It's a little bit terrifying, but extremely exciting on, I think, what this will do
106
00:09:08,081 --> 00:09:12,308
[Matt Schaefer]: and how this will reshape how, you know, businesses run and operate and scale.
107
00:09:13,983 --> 00:09:28,200
[Mikayla Penna]: Awesome, Matt. To add to that: being a saleswoman my entire career, what's exciting, and what I've seen a little sliver of to help me so far, is just being more efficient and more productive in my role specifically.
107
00:09:28,350 --> 00:09:41,500
[Mikayla Penna]: Like, I'm able to give GPT a few prompts and have it spit out my notes from client calls, so I can, you know, put together contracts and follow up on emails to, you know, eventually get deals across the finish line.
107
00:09:41,600 --> 00:09:54,854
[Mikayla Penna]: And for me, spending less time on what I like to call my admin tasks in my role, and more time building deep connections with clients, is a game changer, and I'm just excited for the future to see
108
00:09:55,310 --> 00:09:56,425
[Mikayla Penna]: where this could go from there.
109
00:09:57,780 --> 00:10:08,226
[Megan Foley]: Yeah. I totally agree with you, Mikayla. I totally feel that every single day, and it's really cool being in a small company, that your small team can do the work of like six to ten people now.
110
00:10:08,956 --> 00:10:16,685
[Megan Foley]: And it's not even just small companies that are, like, really seeing the effects. Even the big guys, like the Whole Foods and the Amazons, are also seeing machine learning algorithms
111
00:10:17,233 --> 00:10:21,389
[Megan Foley]: really just take their processes and run with it. There's so many cool new
112
00:10:21,843 --> 00:10:28,096
[Megan Foley]: outputs that really run on chat. And it's really cool to see how they're working their way into your everyday sales processes.
113
00:10:28,475 --> 00:10:33,531
[Megan Foley]: Anything from, like a copilot that's running on like, Microsoft suite and just helping you with Linkedin prospecting,
114
00:10:34,069 --> 00:10:44,306
[Megan Foley]: to really just writing LinkedIn posts, doing marketing, anything from the visual aspect to the copy itself. And it's really just making it a lot easier for the little guys to compete too.
115
00:10:45,499 --> 00:10:46,931
[Mikayla Penna]: Absolutely Megan. Totally agree.
116
00:10:48,044 --> 00:10:59,477
[Jason Harper]: I wonder, so on that note, like so, obviously Microsoft made a big investment into OpenAI in particular and Microsoft is incorporating it into Github and their tools, but I hadn't really
117
00:11:00,509 --> 00:11:20,371
[Jason Harper]: pieced it together that they also own LinkedIn, right? So, like, are we seeing that level of integration occurring within LinkedIn, like you are? Can you talk about that? I don't know that I've actually seen that yet.
118
00:11:14,087 --> 00:11:33,449
[Megan Foley]: Yeah. So, I would say that there are probably about, like, six or seven tools that are trying to be the next big thing. And really, they're still playing around, still trying to figure out who, like, the best one is and who's going to get that Microsoft kinda tie-in. But what it comes down to is, all these tools are scraping somebody's page,
119
00:11:33,759 --> 00:11:41,881
[Megan Foley]: figuring out what they say their sales are, their skills, all these different things, scraping it, putting together a blurb, and you can explain what your company does,
120
00:11:42,772 --> 00:11:51,036
[Megan Foley]: and then what's like your main key product takeaways, and it will scrape their page and see how you match up to make a tailored message to them for you.
121
00:11:52,248 --> 00:11:55,944
[Mikayla Penna]: And to add to that, Megan, that's something you know that I feel like
122
00:11:56,882 --> 00:12:07,525
[Mikayla Penna]: is a huge game changer, because spending time tailoring a small social media message, like a LinkedIn message, for each persona you're talking to is incredibly time consuming.
123
00:12:08,018 --> 00:12:10,347
[Mikayla Penna]: And I used to, you know, spend
124
00:12:10,880 --> 00:12:15,705
[Mikayla Penna]: three, four hours of a block in my day to do this. And now I could do this in
125
00:12:16,064 --> 00:12:24,215
[Mikayla Penna]: thirty minutes, which is amazing, and then get back to the work that's meaningful, and at the end of the day, produces revenue. So, I love it.
126
00:12:26,470 --> 00:12:32,170
[Jason Harper]: That's fantastic. I love it too from that perspective for sure. I would say
127
00:12:32,630 --> 00:12:39,105
[Jason Harper]: one of the, when... In thinking about the comment about creating the posts and that. Like, I just started using Microsoft Designer,
128
00:12:39,822 --> 00:12:41,256
[Jason Harper]: which I think runs on DALL-E,
129
00:12:41,910 --> 00:12:42,230
[Jason Harper]: DALL-E.
130
00:12:43,350 --> 00:12:48,490
[Jason Harper]: And I don't know what y'all's experiences have been with that, but I started playing with it this weekend and found it to be,
131
00:12:49,443 --> 00:12:56,200
[Jason Harper]: I think the technical term is rubbish. Like, I didn't find a lot of success in its ability to actually create something I could use.
131
00:12:56,300 --> 00:13:04,924
[Jason Harper]: It was entertaining, and, like, I enjoyed the results from a, you know, it-made-me-giggle standpoint, but I didn't find anything from the image perspective that was actually
132
00:13:05,394 --> 00:13:05,873
[Jason Harper]: useful.
133
00:13:06,272 --> 00:13:10,604
[Jason Harper]: Curious what, if you guys have different experiences with that?
134
00:13:10,983 --> 00:13:11,222
[Matt Schaefer]: Yeah. Not as advertised.
135
00:13:11,861 --> 00:13:21,063
[Matt Schaefer]: Sorry. Not as advertised, I think is what the, the way that I would sum it up. I think that there could be a ton of really interesting applications for it in future versions of this,
136
00:13:22,022 --> 00:13:25,471
[Matt Schaefer]: but I would agree, not as advertised, underwhelming a little bit.
137
00:13:26,428 --> 00:13:28,262
[Megan Foley]: Yeah. And to kinda add to Matt's point,
138
00:13:28,980 --> 00:13:35,056
[Megan Foley]: I've even expanded it outside of the Microsoft Designer to other platforms. So, Canva has their own called Canva Magic.
139
00:13:35,693 --> 00:13:43,123
[Megan Foley]: And it seems like this really cool feature that's like, oh wow, it's gonna replace Adobe Photoshop, all this sort of stuff. And even Adobe has their own with Firefly.
140
00:13:43,832 --> 00:13:51,400
[Megan Foley]: And really, they are definitely, like, on the beginning cusp. I feel like, if Chat didn't get released when it got released, they would have been a couple years out from it.
140
00:13:51,500 --> 00:13:58,249
[Megan Foley]: And really, I felt like they jumped on the bandwagon and threw it out here, and it really shows that humans right now kind of have that,
141
00:13:58,625 --> 00:14:04,287
[Megan Foley]: that cap space pretty well, like, managed, in that it can't produce the same things that I feel like people can right now.
142
00:14:07,704 --> 00:14:10,486
[Megan Foley]: But we'll see in a couple of years. Maybe it will be completely different.
143
00:14:10,979 --> 00:14:12,015
[Megan Foley]: It's definitely the beginning.
144
00:14:12,972 --> 00:14:19,700
[Jason Harper]: Yeah. Yeah. I think it's been underwhelming and I think you're right. They just jumped the gun. I don't know why they're trying to push so fast when it's just...
144
00:14:19,800 --> 00:14:28,286
[Jason Harper]: So... You can’t help but compare the quality of output that we're getting from, like GPT-4 and think, like, how amazing this is. And then my expectations
145
00:14:28,940 --> 00:14:35,040
[Jason Harper]: have been completely transformed, for what I expect from any of these. So, when I get something back from them that
146
00:14:35,500 --> 00:14:40,140
[Jason Harper]: maybe a year ago, I would have thought, wow. This is really cool. Look at how good they did. It's like, it's...
147
00:14:40,857 --> 00:14:45,555
[Jason Harper]: My expectations are completely different, and it's just, yeah, rubbish, I will stand by that.
148
00:14:46,446 --> 00:14:49,730
[Mikayla Penna]: Yeah. I feel like the bar has been set like super high with
149
00:14:50,344 --> 00:14:50,844
[Mikayla Penna]: every
150
00:14:51,219 --> 00:14:55,553
[Mikayla Penna]: iteration of GPT. And now it's like, the race to the top to get that visual
151
00:14:56,010 --> 00:14:59,939
[Mikayla Penna]: generative AI for several companies, not just OpenAI and
152
00:15:00,954 --> 00:15:03,107
[Mikayla Penna]: I wanna see who gets there, like, who can do it.
153
00:15:04,237 --> 00:15:09,958
[Jason Harper]: Yeah. I think the only place I've seen it successfully applied has been in that sort of, like, using...
154
00:15:11,404 --> 00:15:18,157
[Jason Harper]: Stable Diffusion, where you're really training your own, and you're really, you're feeding it quite a lot of your own bespoke image
155
00:15:19,285 --> 00:15:28,767
[Jason Harper]: data, if you will, then it can actually be really powerful. But it's, it's just not starting from what feels like scratch with DALL-E or these other, these other sources.
156
00:15:30,831 --> 00:15:33,054
[Jacob Newsted]: I kinda feel like a lot of these models,
157
00:15:33,848 --> 00:15:48,968
[Jacob Newsted]: nothing was tailored at first. We had a lot of stuff that was like, oh, I can generate images based on ninety billion different images on the Internet. But, like, to actually put it into a product and say, hey, you can use it with our product X, is a completely different
158
00:15:49,426 --> 00:15:49,926
[Jacob Newsted]: expectation.
159
00:15:50,385 --> 00:15:54,159
[Jacob Newsted]: Because I can ask for a sunny meadow view
160
00:15:54,930 --> 00:16:04,895
[Jacob Newsted]: in Appalachia or something like that. And Stable Diffusion can come up with something really good. Maybe even DALL-E, though I will admit that DALL-E 2 is pretty rough.
161
00:16:06,651 --> 00:16:17,821
[Jacob Newsted]: But I think the problem is, a lot of people just started throwing spaghetti at the wall, and it's not tailored to any one product yet. I think it's gonna be a little bit, but I am really looking forward to seeing, like,
162
00:16:18,473 --> 00:16:20,621
[Jacob Newsted]: how they navigate that. It's gonna be kind of interesting.
163
00:16:23,168 --> 00:16:27,680
[Matt Schaefer]: It's an awareness and an education opportunity right now. Right? I mean, because the,
164
00:16:28,118 --> 00:16:30,133
[Matt Schaefer]: I think about this like, who the hell thought
165
00:16:30,829 --> 00:16:47,548
[Matt Schaefer]: YouTube influencer would be a thing, you know, a decade-plus ago. And now you've got, you know, a new opportunity to be, like, the maestro of prompts and, like, what can be produced. That's gonna be a pretty desirable skill set here in not so,
166
00:16:47,925 --> 00:16:57,186
[Matt Schaefer]: in not a very long period of time. You know, I think that companies are already thinking about this, both from an education and enablement standpoint, but also who I'm hiring for and the job reqs that we're producing,
167
00:16:57,981 --> 00:16:58,958
[Matt Schaefer]: it's just in,
168
00:16:59,332 --> 00:17:01,740
[Matt Schaefer]: it's influencing everything. I think it's super fascinating.
169
00:17:02,208 --> 00:17:05,939
[Matt Schaefer]: Right? And it's... Those who don't embrace are the ones that are gonna get left behind.
170
00:17:06,653 --> 00:17:09,296
[Jason Harper]: So, it's funny, I was having this conversation at
171
00:17:09,907 --> 00:17:11,774
[Jason Harper]: a birthday party on Saturday
172
00:17:12,153 --> 00:17:15,270
[Jason Harper]: with a bunch of ten-year-olds and the parents. And so, we were sitting around talking.
173
00:17:15,830 --> 00:17:20,565
[Jason Harper]: And of course, since I was there, I brought this up and we were talking about GPT and AI and stuff. And
174
00:17:20,959 --> 00:17:30,400
[Jason Harper]: talking about that, in particular, Matt, the usefulness of the skill set of writing a prompt and how important of a skill set that is, to be able to actually write really effective prompts
174
00:17:30,500 --> 00:17:43,831
[Jason Harper]: and looking for that as an employer and things like that. And for me, like, I'm, I may be wrong about this, but I feel like that's gonna be an insanely useful skill set for, like, twelve months. And then it's gonna actually be using
175
00:17:44,384 --> 00:17:54,320
[Jason Harper]: GPT and GPT-like solutions to actually write the prompts. And so, your ability to create these really well constructed prompts and the nuance that goes into that is critical, but it's super short lived
176
00:17:54,620 --> 00:18:02,544
[Jason Harper]: because it's really about create... Like, it really is more about pulling what you're looking for and stuff. But so, I really, this whole space, I guess, I just say that it's a...
177
00:18:03,738 --> 00:18:13,538
[Jason Harper]: I don't know where this... I guess, from a skill set needed to take advantage of this, I don't have my head around exactly what the right skill set is in the long or even medium term.
178
00:18:15,208 --> 00:18:17,377
[Mikayla Penna]: Yeah. Right now, I see it as
179
00:18:18,403 --> 00:18:23,680
[Mikayla Penna]: in the short-term, companies hiring for roles like AI Project Manager or
180
00:18:24,453 --> 00:18:26,068
[Mikayla Penna]: AI data science
181
00:18:26,854 --> 00:18:27,354
[Mikayla Penna]: manager,
182
00:18:27,886 --> 00:18:34,445
[Mikayla Penna]: people that can, you know, have the skills to prompt it and have worked on that and are ahead of the pack in that game and then
183
00:18:35,042 --> 00:18:36,954
[Mikayla Penna]: eventually, from what you were saying Jason,
184
00:18:37,432 --> 00:18:38,467
[Mikayla Penna]: pivoting to...
185
00:18:39,105 --> 00:18:45,115
[Mikayla Penna]: Now I run the whole department within my company that's in charge of AI and I managed these tools that we're using
186
00:18:45,649 --> 00:18:46,286
[Mikayla Penna]: for AI.
187
00:18:46,922 --> 00:18:50,875
[Mikayla Penna]: And I think that's really exciting, and also when I talk to like, my friends
188
00:18:51,233 --> 00:19:04,896
[Mikayla Penna]: and my family that are outside of, like, this whole tech space we live in. They don't even know what ChatGPT is yet. Right? They're like, Oh, it's just, like scary robot stuff. That's gonna take over the world. Like, no offense mom and dad, but their
189
00:19:05,274 --> 00:19:15,668
[Mikayla Penna]: their generation just doesn't get it. And I just think it's really important to know, like, if you're not a robot, because I'm not yet, right? You're like, what about, like, soft skills and empathy,
190
00:19:16,059 --> 00:19:22,591
[Mikayla Penna]: what about all these things? And I don't think you should be scared of that. Like, job skills such as, like, communication and empathy and
191
00:19:23,854 --> 00:19:26,087
[Mikayla Penna]: having a deep knowledge of, like, who you're talking to.
192
00:19:27,442 --> 00:19:32,700
[Mikayla Penna]: Now, where we stand and in my opinion into the future, that's not something that AI is going to replace.
192
00:19:32,800 --> 00:19:45,895
[Mikayla Penna]: But if you have the hard skills, right, of understanding AI and understanding how to prompt it, or even how it works like Jason... or not... I'm sorry. Not Jason. Delete that, Anna. Like Jacob explained at the beginning of this call.
193
00:19:46,930 --> 00:19:58,470
[Mikayla Penna]: I think those will be huge skills in the job market, and I know everyone on this call that I'm on with today, like, we're excited that we're into it and we're talking about it, and I think when the greater public
194
00:19:58,863 --> 00:20:14,820
[Mikayla Penna]: you know, finally embraces AI in general, whether that be DALL-E or ChatGPT. I think we'll be ahead of the game. We'll be ahead of the curve, you know, knowing about something and being excited about it and teaching yourself knowledge before the general public goes wild,
195
00:20:15,635 --> 00:20:18,515
[Mikayla Penna]: you know, makes me really happy and excited for where I'm at.
196
00:20:19,475 --> 00:20:30,900
[Jason Harper]: I think that's a good point. I sometimes forget too. I forget that people don't know what this is. Like, and I'm reminded every now and again, like, at this birthday party. I was like, oh, you've never? I'm trying to describe it as though everyone knows.
196
00:20:30,950 --> 00:20:38,900
[Jason Harper]: I was like, you haven't seen this? I'm like, oh, like, pull up my phone. I'm showing them for the first time this thing that I'm, like, yeah. I don't know. I thought everybody used this.
196
00:20:38,950 --> 00:20:49,113
[Jason Harper]: I'm confused by this conversation. Give me a moment to reset my brain to back to reality because not everybody is using this, you know, all day every day and as, you know, it's like
197
00:20:49,424 --> 00:21:00,847
[Jason Harper]: you know, it is. And I think, too, it's imperfect and there are lots of issues with it. So, it makes sense that most people haven't fully adopted this, because it's wrong a lot and you have to, like, be willing to deal with this thing that's, you know,
198
00:21:01,564 --> 00:21:09,295
[Jason Harper]: It sounds like it's super smart and super right, and you can trust it for everything, but it's you know, it's not. It's still so early. I sometimes forget that.
199
00:21:10,249 --> 00:21:17,679
[Mikayla Penna]: Exactly. And I don't think we'll see, like, full, like, public use of this until it is trained to be more accurate.
200
00:21:18,213 --> 00:21:18,452
[Mikayla Penna]: Right?
201
00:21:19,565 --> 00:21:19,963
[Jason Harper]: Mh.
202
00:21:21,412 --> 00:21:25,398
[Megan Foley]: I feel like the biggest conversation at kids' birthday parties with it is about cheating, probably? Right, Jason?
203
00:21:26,993 --> 00:21:31,332
[Jason Harper]: That does come up. That does certainly come up. Yeah. I mean, we're seeing that I guess,
204
00:21:32,025 --> 00:21:38,685
[Jason Harper]: especially at the college level, like, being able to... Just everyone seems to be using it to write everything. So, I don't know how college is really gonna
205
00:21:39,121 --> 00:21:47,520
[Jason Harper]: react to this. I feel like this is sort of a lean-in opportunity to figure out how to incorporate this into everybody's, you know, educational experience versus
205
00:21:47,650 --> 00:21:57,520
[Jason Harper]: you know, I'm already seeing, like, places trying to put the guardrails up to, like, combat and fight the use of this technology and whatever. But I feel like that's a fool's errand. But yes, I could talk about that separately.
206
00:22:00,012 --> 00:22:05,765
[Megan Foley]: The willingness to learn and the adaptability, really. I feel like those are the soft skills needed to really take on the challenge.
207
00:22:06,340 --> 00:22:17,483
[Mikayla Penna]: Yeah. And I think there's so many, like, different businesses that can be made out of this, whether it's a business that detects if someone wrote a paper or took a test with AI help or,
208
00:22:18,890 --> 00:22:23,919
[Mikayla Penna]: you know, hospitals using AI to make better informed decisions to
209
00:22:25,103 --> 00:22:31,326
[Mikayla Penna]: do away with human error. There are so many things that get my wheels spinning, but I'm like, oh my gosh, so many new businesses, not even just in tech, are gonna come out of this era, I feel like.
210
00:22:31,953 --> 00:22:40,030
[Jason Harper]: Well, Mikayla, as you're talking about that, I think about what the implications of this are, too.
211
00:22:40,865 --> 00:22:41,762
[Jason Harper]: I was talking to
212
00:22:42,137 --> 00:22:54,862
[Jason Harper]: a friend of mine last week, sort of about some of the implications of some of the visual and audio components of this and its ability to, kinda like, fake things in real time. I think a corollary, some of the results of this too, will be sort of like,
213
00:22:55,813 --> 00:23:02,748
[Jason Harper]: kinda what the opposite of scaling is, descaling I think is the wrong word, but, you know, bringing things back where you're not able to do...
214
00:23:03,385 --> 00:23:19,995
[Jason Harper]: We're gonna rely more on being in person and being in real life with people. Like, it's gonna place more value on being in person and talking to people, or going to the park, or doing, like, these sorts of things, like having conversations. Because the amount of, you know, curated
215
00:23:20,608 --> 00:23:28,500
[Jason Harper]: automated video content, where you're talking to things or interacting with things that aren't necessarily an actual human on the other side, is gonna make phone calls feel less real. Right?
215
00:23:28,600 --> 00:23:47,005
[Jason Harper]: So, you know, I've seen and felt that we've changed from phone calls feeling very real to video calls now being my normal default state for a lot of conversations that I'm having, that sort of thing. And I think there's gonna be another change where it's like, I just wanna be in person to actually see you and
216
00:23:47,385 --> 00:24:05,109
[Jason Harper]: be around you and in your space. And I think same thing with colleges and testing and things like that. It's gonna reduce our ability to scale and do these big broad things, and instead focus on, like, more interpersonal interactions or relationships with teachers, with peers, and things like that, which I think...
217
00:24:05,719 --> 00:24:11,229
[Jason Harper]: That, I feel, may slow some things down, but be a true net benefit to helping people interact with one another.
218
00:24:12,106 --> 00:24:22,945
[Matt Schaefer]: Oh my gosh. Wouldn't that be an amazing byproduct at this moment in time? I mean, many of us have young kids. We talk about screen time all the time in my house, and, like, the video games, and, like, how do I...
219
00:24:23,838 --> 00:24:34,220
[Matt Schaefer]: It's gotten to an unhealthy level. And I mean, that's an interesting perspective of the future, that this actually drives both innovation and more human connection than anything else.
220
00:24:35,334 --> 00:24:38,039
[Matt Schaefer]: I'd... My chips are on that one, Jason. Yeah.
221
00:24:39,883 --> 00:24:41,020
[Jason Harper]: Yeah.
222
00:24:41,793 --> 00:24:52,949
[Anna Schultz]: Yeah. Absolutely. Wouldn't that be a good outcome of all this? And it's funny. I think a lot of people think it's, you know, the opposite, that it's driving us towards this AI, towards this more technology. But maybe it will have the opposite effect.
223
00:24:54,779 --> 00:24:56,871
[Anna Schultz]: So, we've gotten a lot into the
224
00:24:57,818 --> 00:25:00,921
[Anna Schultz]: aspects of ChatGPT helping with business operations,
225
00:25:01,318 --> 00:25:04,182
[Anna Schultz]: maybe image generation, content creation, things like that.
226
00:25:04,914 --> 00:25:12,575
[Anna Schultz]: One of the other areas I know we've all seen great potential for generative AI is in enhancing business intelligence and data driven decision making.
227
00:25:13,306 --> 00:25:17,146
[Anna Schultz]: So, can we kind of steer the conversation and talk about how ChatGPT,
228
00:25:17,520 --> 00:25:22,052
[Anna Schultz]: or generative AI in general, can help in generating actionable insights from data?
229
00:25:25,089 --> 00:25:27,338
[Mikayla Penna]: Sure. I can speak to just something that's
230
00:25:28,111 --> 00:25:32,428
[Mikayla Penna]: kinda driving me and keeping me up at night, is how businesses will use this
231
00:25:32,739 --> 00:25:43,320
[Mikayla Penna]: to be more custom to their customers. Again, salesgirl talking here. So I'm thinking about the money always. And for me, it's like, okay. In today's age, say I wanna go and buy
232
00:25:43,777 --> 00:25:48,587
[Mikayla Penna]: a new pair of sweatpants. Right? I'm like, okay, I'm gonna go to like, my five go-to stores.
233
00:25:49,377 --> 00:25:50,726
[Mikayla Penna]: Price is gonna be a factor,
234
00:25:51,202 --> 00:26:03,700
[Mikayla Penna]: the type of pants I'm gonna look for is gonna be a factor. And then I kind of, like, price shop across the Internet. Right? And now I feel like, if companies can incorporate this... Let's say a big one like Target, for example.
234
00:26:03,800 --> 00:26:17,500
[Mikayla Penna]: If I can go on to Target and have GPT like white labeled on their site and ask it questions about what I'm trying to find, and it spits out what they sell to me right away without having to, like, go through their search and do all this.
234
00:26:17,600 --> 00:26:31,610
[Mikayla Penna]: I'm gonna buy it right then instead of just putting it in my shopping cart and waiting. Half the time, I'll put thousands of dollars of clothes in my shopping cart, and I'll never buy them. It's just like window shopping on the Internet. And I feel like it could lead to making faster
235
00:26:32,162 --> 00:26:32,662
[Mikayla Penna]: customer
236
00:26:33,193 --> 00:26:43,686
[Mikayla Penna]: decisions, which, you know, lead to more revenue for businesses, whether small or large. And that's something I think about all the time, on all levels of business, specifically retail.
237
00:26:48,001 --> 00:26:48,980
[Anna Schultz]: Yeah. Absolutely.
238
00:26:50,235 --> 00:26:54,385
[Matt Schaefer]: I think from my perspective, like, as you think about the world of BI, data science,
239
00:26:54,955 --> 00:26:57,649
[Matt Schaefer]: analytics. I mean, it still is a very kind of like...
240
00:26:59,392 --> 00:27:06,913
[Matt Schaefer]: I guess it's a world that's kind of under the wrapper of super technical, and I think that this, you know, kind of helps to... I hate saying demystify,
241
00:27:07,291 --> 00:27:09,946
[Matt Schaefer]: but I think really makes that world much more approachable.
242
00:27:10,498 --> 00:27:13,788
[Matt Schaefer]: And some things that we're thinking about like at Ready Signal is
243
00:27:14,245 --> 00:27:15,223
[Matt Schaefer]: playing in this world of
244
00:27:15,998 --> 00:27:17,911
[Matt Schaefer]: external data, leading indicators,
245
00:27:18,643 --> 00:27:24,394
[Matt Schaefer]: purpose-built forecasts, being able to provide these in a way that they're both consumable, but also interpretable
246
00:27:24,769 --> 00:27:26,996
[Matt Schaefer]: becomes extremely important in our world,
247
00:27:28,129 --> 00:27:33,669
[Matt Schaefer]: both through, like, trust and actual embrace of the application, i.e., the forecast that I'm gonna
248
00:27:33,970 --> 00:27:45,079
[Matt Schaefer]: rely on to run my business and, you know, drive some of the next four or five decisions that I'm going to make today. I think that just the way that we're incorporating that to have that human, kind of, readable
249
00:27:45,709 --> 00:27:59,541
[Matt Schaefer]: summarization of something that's super technical. Again, not the practitioner on the call, but, you know, I benefit from it, my business benefits from it, and my customers benefit from it every single day and I just really think that's been an incredible
250
00:27:59,918 --> 00:28:12,054
[Matt Schaefer]: game-changer from my perspective, at least from my vantage point. I don't know, Jason, if you share that? Or maybe, to the rest of the panel here, if there's anything that you'd comment on or challenge in that statement?
251
00:28:14,856 --> 00:28:18,913
[Jason Harper]: No. I mean, I think it's very similar to the way that we're seeing things. I don't...
252
00:28:19,788 --> 00:28:21,003
[Jason Harper]: I mean, I know
253
00:28:22,587 --> 00:28:27,604
[Jason Harper]: I guess what I'd say... It looked like, Jacob, you were gonna chime in? I mean, do you have any thoughts on this? Go ahead, Jacob.
254
00:28:28,560 --> 00:28:33,610
[Jacob Newsted]: Yeah. Sorry. Well, I use it in my work life right now. We recently had a project
255
00:28:34,063 --> 00:28:36,152
[Jacob Newsted]: that had a large
256
00:28:36,605 --> 00:28:40,777
[Jacob Newsted]: table of customer information, feedback, sentiment,
257
00:28:41,232 --> 00:28:42,368
[Jacob Newsted]: all sorts of things
258
00:28:42,823 --> 00:28:50,652
[Jacob Newsted]: in a structured way, but it's like text and lots of other things. It's hard to go through everything and summarize it all. So, we used
259
00:28:51,344 --> 00:28:51,844
[Jacob Newsted]: ChatGPT
260
00:28:52,218 --> 00:28:52,718
[Jacob Newsted]: and
261
00:28:53,028 --> 00:28:58,525
[Jacob Newsted]: shoutout to this Python library if any techies are listening, Llama Index. It makes it super easy.
262
00:29:00,454 --> 00:29:02,154
[Jacob Newsted]: We are able to
263
00:29:03,014 --> 00:29:03,834
[Jacob Newsted]: ask questions
264
00:29:04,614 --> 00:29:10,069
[Jacob Newsted]: to this table if that makes any sense, and it gives us summarizations. It gives us feedback.
265
00:29:10,949 --> 00:29:15,749
[Jacob Newsted]: What's the overall sentiment during this time frame, or during this event, or something?
266
00:29:16,323 --> 00:29:16,823
[Jacob Newsted]: And
267
00:29:17,356 --> 00:29:19,286
[Jacob Newsted]: it just gives us human readable
268
00:29:20,057 --> 00:29:23,179
[Jacob Newsted]: insights into it, which is huge. Because whereas
269
00:29:23,488 --> 00:29:27,329
[Jacob Newsted]: I used to do random data science, like
270
00:29:27,943 --> 00:29:28,443
[Jacob Newsted]: probabilistic
271
00:29:29,057 --> 00:29:34,576
[Jacob Newsted]: analysis on things, I don't need to do that. All we do is we have an app that a person
272
00:29:35,109 --> 00:29:36,403
[Jacob Newsted]: asks a question
273
00:29:36,856 --> 00:29:39,476
[Jacob Newsted]: and the data answers itself back to you.
274
00:29:40,129 --> 00:29:40,928
[Jacob Newsted]: Which is huge.
275
00:29:41,727 --> 00:29:46,042
[Jacob Newsted]: Because now it's not me that needs to do this. I make the app. Everybody else asks the questions,
276
00:29:46,920 --> 00:29:48,394
[Jacob Newsted]: which is huge. It’s transformative,
277
00:29:49,173 --> 00:29:55,646
[Jacob Newsted]: and it gives us the ability to do even crazier things, to hopefully answer everybody's questions.
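Jacob's "ask questions to a table" workflow can be sketched in miniature. The toy below is not Llama Index itself: the table, function name, and keyword lists are all made up for illustration. A real pipeline would hand the filtered rows to an LLM for summarization; this stand-in uses crude keyword counts just to show the shape of the pattern (filter a structured table by time frame, then summarize the free text in it):

```python
from datetime import date

# Toy feedback table: (day, comment) rows, standing in for the kind of
# structured customer-experience text Jacob describes querying.
FEEDBACK = [
    (date(2023, 5, 1), "Great support, the team was helpful"),
    (date(2023, 5, 2), "Checkout was slow and confusing"),
    (date(2023, 5, 3), "Love the new dashboard, very helpful"),
]

# Crude keyword lexicons; an LLM-backed query engine replaces these.
POSITIVE = {"great", "love", "helpful"}
NEGATIVE = {"slow", "confusing", "broken"}

def summarize_sentiment(rows, start, end):
    """Answer 'what's the overall sentiment during this time frame?'
    by counting positive/negative keyword hits in the date range."""
    pos = neg = 0
    for day, comment in rows:
        if start <= day <= end:
            words = set(comment.lower().split())
            pos += len(words & POSITIVE)
            neg += len(words & NEGATIVE)
    overall = "positive" if pos > neg else "negative" if neg > pos else "mixed"
    return {"positive_hits": pos, "negative_hits": neg, "overall": overall}
```

Asking for the whole range here reports "positive" overall, while narrowing to just May 2nd flips it to "negative", which is the kind of time-frame question Jacob describes, minus the language model doing the heavy lifting.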
278
00:29:56,605 --> 00:30:00,615
[Jason Harper]: I'll say, having used that specific
279
00:30:01,307 --> 00:30:01,784
[Jason Harper]: library,
280
00:30:02,419 --> 00:30:03,531
[Jason Harper]: all weekend and including...
281
00:30:04,103 --> 00:30:05,060
[Jason Harper]: Several hours of today.
282
00:30:05,857 --> 00:30:06,814
[Jason Harper]: It is amazing.
283
00:30:07,292 --> 00:30:19,261
[Jason Harper]: There are lots of pitfalls, though, to using it. On the technical side, number one, it expects to read in a folder, not a file. I spent two hours trying to figure out why it wasn't working. So... Thank you, Dom, for helping me with that this morning.
284
00:30:20,217 --> 00:30:30,971
[Jason Harper]: Yeah. No. It happens. But what's also interesting is, I was able to process around three hundred thousand comments in another kind of text customer experience thing we're looking at.
285
00:30:32,577 --> 00:30:33,930
[Jason Harper]: But what's interesting is,
286
00:30:34,726 --> 00:30:49,596
[Jason Harper]: it's still not... It doesn't work on the first try. Right? So learning, like, even training and training samples and that. So, like, this is not magic. It might sound like it, and put a magic wrapper around it, but, like, getting
287
00:30:49,975 --> 00:30:51,685
[Jason Harper]: getting actual
288
00:30:52,621 --> 00:30:58,542
[Jason Harper]: insights out of data still requires a lot of thinking and time. It's not as simple as just
289
00:30:59,094 --> 00:31:05,715
[Jason Harper]: throwing a dataset over the wall and then coming back to this magical thing that you can ask questions of. It requires really intelligent,
290
00:31:06,924 --> 00:31:08,752
[Jason Harper]: and intentional organization of the data,
291
00:31:09,467 --> 00:31:23,300
[Jason Harper]: looking at it, reacting, responding to it, and changing it, right? And so, incorporating that, we've had to make, you know, several iterations of changes to this. And I constantly have my little usage screen up to, like, look at my, like,
291
00:31:23,400 --> 00:31:36,400
[Jason Harper]: oh, that was four dollars. Oh, that was eight dollars. Oh, that was fifty bucks, oops. Like, just watching the training of these things, it's, you know, interesting, and it's very complicated. So, I'd say... That's what I'm... The vision of what you're doing there
291
00:31:36,500 --> 00:31:50,920
[Jason Harper]: and what we're trying to do, and some of this stuff, is, like, it's gonna be magic. It's gonna look like magic to the end user. But to get there, there's a lot of smoke and mirrors and putting things in place to actually make this thing happen.
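Jason's habit of watching the usage meter comes down to simple arithmetic: hosted LLM APIs typically bill per token, split between prompt and completion. The per-1K prices below are placeholders, not any provider's actual rates, so treat this as a sketch of how per-token billing adds up rather than a billing calculator:

```python
# Hypothetical per-1K-token prices in USD; real rates vary by model
# and change over time, so look yours up before relying on this.
PRICE_PER_1K = {
    "prompt": 0.03,
    "completion": 0.06,
}

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Rough cost of one API call: tokens / 1000 * price, summed per side."""
    cost = (prompt_tokens / 1000) * PRICE_PER_1K["prompt"]
    cost += (completion_tokens / 1000) * PRICE_PER_1K["completion"]
    return round(cost, 4)
```

At these made-up rates, pushing fifteen million tokens of comments through as prompt text alone would run about $450, which is why an "oh, that was fifty bucks" moment sneaks up on you during iterative indexing runs.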
292
00:31:52,277 --> 00:31:59,477
[Jacob Newsted]: It's actually interesting, speaking to that and specifically Llama Index. Unfortunately, I'm sorry you struggled with that one.
293
00:32:00,835 --> 00:32:01,335
[Jacob Newsted]: Literally,
294
00:32:01,794 --> 00:32:04,451
[Jacob Newsted]: the thing about that is, it makes it so
295
00:32:04,910 --> 00:32:12,716
[Jacob Newsted]: your prompts, your questions, and everything are not just gonna be tailored to GPT anymore. They're actually tailored to your dataset.
296
00:32:13,445 --> 00:32:18,596
[Jacob Newsted]: Because, I don't wanna get too jargon-y, but I don't know if you tried the vector store,
297
00:32:21,071 --> 00:32:23,243
[Jacob Newsted]: Python kind of class? Basically,
298
00:32:23,619 --> 00:32:25,768
[Jacob Newsted]: it takes your data and it creates...
299
00:32:26,963 --> 00:32:27,440
[Jacob Newsted]: I don't know.
300
00:32:28,890 --> 00:32:29,630
[Jacob Newsted]: It creates
301
00:32:30,010 --> 00:32:30,670
[Jacob Newsted]: kind of
302
00:32:31,049 --> 00:32:36,250
[Jacob Newsted]: themes and ideas around it and it stores them in these kind of like semantic ways.
303
00:32:37,064 --> 00:32:42,810
[Jacob Newsted]: So even just asking questions of your data, one dataset is not going to have the same kind of semantic
304
00:32:43,502 --> 00:32:45,966
[Jacob Newsted]: relationships between every single row as the next one.
305
00:32:46,934 --> 00:32:56,405
[Jacob Newsted]: So you're not just fighting against GPT and prompting anymore. It's really gonna be an art of, and like you said, there's tons of tuning. It's lots of different things that are just gonna...
306
00:32:57,200 --> 00:33:03,957
[Jacob Newsted]: It's not magic. Well, we hope to make it look like magic. But it isn’t.
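Jacob's point about semantic storage can be illustrated without any ML libraries. A vector store keeps an embedding per document and answers a query by nearest-neighbor similarity. The tiny three-dimensional vectors below are hand-made stand-ins for real model-generated embeddings, and the document set and function names are invented for this sketch, but the retrieval is the same cosine-similarity idea:

```python
import math

# Hand-made 3-d "embeddings". Real vector stores hold model-generated
# vectors with hundreds of dimensions, but the lookup works the same way.
DOCS = {
    "shipping was slow":       [0.9, 0.1, 0.0],
    "delivery took two weeks": [0.8, 0.2, 0.1],
    "the app looks beautiful": [0.0, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity: dot product over the product of vector lengths."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest(query_vec, docs):
    """Return the stored text whose embedding is most similar to the query."""
    return max(docs, key=lambda text: cosine(query_vec, docs[text]))
```

Notice that the two delivery complaints sit close together in this space (their cosine similarity is near 1) while the UI compliment sits far away. That clustering is the "semantic relationships" Jacob mentions, and it's dataset-specific, which is why tuning one dataset's index tells you little about the next one's.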
308
00:33:04,529 --> 00:33:12,233
[Jason Harper]: Yeah. It should feel that way to the users. I would add to that, too. One thing I've felt myself, like... So I...
309
00:33:13,044 --> 00:33:21,500
[Jason Harper]: was a coder at one point in my life. I think I can claim that. There are some people who would take massive exception to that. If Lary is listening, Lary is like, you're not a coder, Jason.
309
00:33:21,600 --> 00:33:32,878
[Jason Harper]: That's okay. Lary with one 'R', by the way. But so, at any rate... I think for me, like, it has empowered me to go write code again, and, like, I'm spending a ton of time writing Python. And so, the thing is...
310
00:33:33,752 --> 00:33:42,900
[Jason Harper]: But that's the thing. Right? I'm not a coder, I admit it. And so, like, I run into these little issues that even ChatGPT can't diagnose, because it doesn't know the issue. And I spent hours on this thing.
310
00:33:43,000 --> 00:33:55,798
[Jason Harper]: And this morning, meeting with an actual coder, showing him my code. He's like, oh, yeah, that should be a directory, not a file, like, within seconds. And I was like, oh, got it. Okay. Thank you. But so I think like, there is sort of, like, this like,
311
00:33:56,995 --> 00:34:15,522
[Jason Harper]: fun exploratory stuff where there's gonna be a lot of skinning of the knees. And I think that we're at a phase now where it's super approachable and easy to access, but to actually get the true results out of it is still really hard. Like, it's really, really hard, and requires actual skills that are developed in people over a long period of time.
312
00:34:19,340 --> 00:34:30,048
[Anna Schultz]: Thank you all for those answers on that. I think it'll be really interesting to see how that all develops. And at RXA, that's our bread and butter, right? So that's kinda what we're working on figuring out.
313
00:34:31,643 --> 00:34:35,392
[Anna Schultz]: One final question for everybody to kinda take things in a different direction.
314
00:34:36,603 --> 00:34:44,965
[Anna Schultz]: As these types of generative AI become more integrated into the business world, the skills required for professionals are evolving, as we touched on earlier.
315
00:34:45,940 --> 00:34:48,040
[Anna Schultz]: Do any of you guys have any additional thoughts
316
00:34:48,579 --> 00:34:53,220
[Anna Schultz]: or elaborations on the importance of candidates who can effectively utilize AI tools?
317
00:34:53,794 --> 00:34:56,109
[Anna Schultz]: Maybe how it'll impact the future of the job market?
318
00:34:57,067 --> 00:35:02,495
[Anna Schultz]: Additionally, maybe how companies can work to upskill their current employees to leverage these technologies more effectively?
319
00:35:06,102 --> 00:35:20,817
[Megan Foley]: Yeah. I guess I can start. So I feel like, to kinda start, the fear of automated work is not something that's novel. It's been around since the 1800s, believe it or not. And there's a really interesting passage that I was reading from, like, a BYU professor
320
00:35:21,129 --> 00:35:33,800
[Megan Foley]: who pretty much said that this is not a new idea, and that every single year, we kinda come out with a new thing that's gonna replace the work. Like, we're not gonna work anymore. But really, when you look back at it, it just really made everyone more efficient.
320
00:35:34,000 --> 00:35:47,460
[Megan Foley]: And, like, we're talking about data, so I might as well bring a data source into this. The World Economic Forum said that there were eighty-five million jobs that are going to be replaced by AI one day. But if you look at that same statistic, they say that ninety-seven million are going to be created.
321
00:35:48,033 --> 00:35:57,100
[Megan Foley]: So work really doesn't get replaced. It's just the work that isn't efficient, that people don't want to do. We didn't want to plow the fields ourselves, so we built a tractor, kind of thing.
321
00:35:57,200 --> 00:36:08,500
[Megan Foley]: So pretty much, that's kind of what a lot of this comes down to. And you're gonna get left behind if you don't adapt. But at the same time, if you adapt and you become more efficient, you're just not really gonna, I feel like, even see it in the future.
321
00:36:08,600 --> 00:36:20,560
[Megan Foley]: You're just gonna be looking back at it one day, like, hey, the iPhone didn't exist twenty years ago. It does today, and I can't imagine my life without it. That's really where I feel like this kind of conversation is going.
322
00:36:21,451 --> 00:36:27,869
[Mikayla Penna]: That was really well said, Megan, and I completely agree with everything you just mentioned, and I just feel like
323
00:36:28,386 --> 00:36:30,159
[Mikayla Penna]: it's really going to
324
00:36:30,693 --> 00:36:37,650
[Mikayla Penna]: not only automate the workforce even more, because, like, that's what we've been doing for the past twenty years, is automation,
325
00:36:38,024 --> 00:36:42,578
[Mikayla Penna]: however you wanna look at it. But for me, it's like, it gets people back to doing the jobs
326
00:36:42,889 --> 00:36:52,402
[Mikayla Penna]: that they were hired for and the skills that they shine at, and that they're good at, and then takes away, you know, those hard admin tasks that are just...
327
00:36:53,192 --> 00:36:57,885
[Mikayla Penna]: nobody wants to do. Right? And at the end of the day, you could look at it. It saves your, like,
328
00:36:58,999 --> 00:37:07,540
[Mikayla Penna]: overall workforce payroll. You might not need to hire somebody to do X, Y, and Z if the employees you have have the skills to know how to use AI
329
00:37:08,112 --> 00:37:11,531
[Mikayla Penna]: to make them more efficient and work faster and better and smarter.
330
00:37:13,598 --> 00:37:13,836
[Megan Foley]: Yeah.
331
00:37:14,551 --> 00:37:25,890
[Megan Foley]: And think about all the jobs that didn't exist twenty years ago. Even like, four years ago. Like, a computer programmer or just stuff like this. So, I feel like it's such a cool experience that we get to see the future kinda get built here.
332
00:37:26,463 --> 00:37:38,400
[Megan Foley]: And there's jobs that haven't even existed yet today that are gonna be, like, someone's job in the future. So, I just feel like that's a really cool thing. And it just shows you what happens if your company is willing to adapt. Like, at Ready Signal,
332
00:37:38,600 --> 00:37:51,776
[Megan Foley]: we talk every day and we're like, okay, what's the new technology that we wanna see if we can work into our workflows? And we just kind of have conversations, bounce ideas back and forth, and adapt. It's a really cool way to jump on the train for the future and really just get more efficient.
333
00:37:53,541 --> 00:38:06,644
[Mikayla Penna]: And even if this doesn't pan out to be, like, unicorns and rainbows, like us on this call think it'll be, at the end of the day, you're teaching yourself how to be more efficient and how to use these skills. Whether companies embrace it or not, just you as a person
334
00:38:07,024 --> 00:38:12,145
[Mikayla Penna]: can benefit from learning how to prompt and learning how generative AI works.
335
00:38:13,339 --> 00:38:20,066
[Jason Harper]: Yeah. I don't know. I'm curious if anybody else feels the same way, but... I use it so much. I am noticing that sometimes
336
00:38:20,682 --> 00:38:31,128
[Jason Harper]: I'm talking to people and asking questions. And it's... I think it's changing the way that I actually ask questions and talk to other people. I don't know if I'm the only one, but like, I've definitely noticed. I was like, oh, alright. That's different.
337
00:38:32,558 --> 00:38:33,670
[Megan Foley]: You prompt them like Chat?
338
00:38:34,464 --> 00:38:36,053
[Jason Harper]: Just like how I'm just like,
339
00:38:36,863 --> 00:38:43,700
[Jason Harper]: I think, like, how I... Honestly, like, with my wife, we're very different people, wired very differently. She's extremely... She's a
339
00:38:43,800 --> 00:38:59,250
[Jason Harper]: pharmacist, and so she's very focused and detail oriented. And I think it's actually improved my communications with her, because I think I ask questions more clearly now, or in a different way that actually explains things better. So, I don't... Maybe I'm the only one. I don't know.
340
00:39:03,255 --> 00:39:13,500
[Megan Foley]: I definitely thought you were gonna talk about... Oh, sorry, Matt. But I definitely thought you were gonna, like, say, act like a pharmacy expert for me, and then just go off on the different prompts that you're doing. But I can definitely see how
340
00:39:13,600 --> 00:39:21,500
[Megan Foley]: you have to be more clear and concise, because the robots can just, you know, totally come at it from a different direction if you aren't asking a very clear question.
340
00:39:21,800 --> 00:39:31,873
[Jason Harper]: I think I'm leaving some assumptions out when I'm asking questions. I'm actually, like just being a little bit more clear and not assuming quite as much because I've learned through this rapid,
341
00:39:32,428 --> 00:39:44,473
[Jason Harper]: like, you know, query, prompt response, prompt response, and having to hone my questions. So instead of asking... Leaving fewer assumptions, right? So being a little bit more detailed. Not a ton, but a little bit more
342
00:39:44,783 --> 00:39:48,760
[Jason Harper]: in my speaking, I don't know. I think it's doing something in my brain.
343
00:39:51,145 --> 00:39:52,418
[Mikayla Penna]: You're turning into a robot.
344
00:39:55,217 --> 00:40:01,024
[Matt Schaefer]: See, at least it can still be the butt of a lot of jokes in the office around the water cooler. Guilty of it myself.
345
00:40:02,075 --> 00:40:05,934
[Matt Schaefer]: I just think this is really an incredible, like, moment in time to, like,
346
00:40:06,474 --> 00:40:06,974
[Matt Schaefer]: watch
347
00:40:08,075 --> 00:40:11,569
[Matt Schaefer]: evolution happen in real time. And I think that it's really
348
00:40:12,007 --> 00:40:24,300
[Matt Schaefer]: opportunistic from an efficiency standpoint when you think about us, and our greatest resource is time and it's finite. And if I'm not spending eighty or forty or fifty percent of my time, whatever it may be
00:40:24,400 --> 00:40:32,784
[Matt Schaefer]: on some of these things that are menial tasks that could be automated, it's an incredible opportunity. And like when you put that into the scope of
349
00:40:33,163 --> 00:40:35,660
[Matt Schaefer]: work life balance and just
350
00:40:36,454 --> 00:40:36,954
[Matt Schaefer]: effectiveness
351
00:40:37,414 --> 00:40:50,475
[Matt Schaefer]: and throughput that my teams can generate that translates to value, and they actually embrace it. I think it is pretty awesome. Right, again, this is, like, maybe the rosy-lens version of what this is, but I think it's incredibly fascinating. I think that, you know,
352
00:40:51,571 --> 00:40:56,792
[Matt Schaefer]: it's just fun to watch this play out because I think we all can benefit and continue to
353
00:40:57,169 --> 00:40:57,669
[Matt Schaefer]: help
354
00:40:58,685 --> 00:40:59,185
[Matt Schaefer]: influence
355
00:40:59,642 --> 00:41:01,556
[Matt Schaefer]: this so that the robots maybe don't take over.
356
00:41:02,685 --> 00:41:02,924
[Mikayla Penna]: Right.
357
00:41:05,149 --> 00:41:06,579
[Anna Schultz]: Absolutely. Thank you all. Go ahead, Jacob.
358
00:41:07,374 --> 00:41:13,870
[Jacob Newsted]: No. Sorry. I was just gonna say that I find this kinda interesting because, like, previous technological
359
00:41:14,409 --> 00:41:24,645
[Jacob Newsted]: like, influences kind of didn't affect me. It was a big... Oh, it doesn't affect me at all. And AI affects all of us in every way. So, I kind of hope
360
00:41:25,019 --> 00:41:33,074
[Jacob Newsted]: that whereas it didn't affect me before, well, maybe now people take fifteen minutes out of their day to just read what the newest stuff is. Like,
361
00:41:33,711 --> 00:41:47,937
[Jacob Newsted]: I've done that before as like a researcher. I read papers, but that doesn't need to be your way of reading this kind of new AI news. Read a blog post, read whatever the new thing Google shoved at you is, and I just hope people try and learn things more often
362
00:41:49,111 --> 00:41:51,979
[Jacob Newsted]: and react to this rather than just have it hit them like a truck.
363
00:41:53,015 --> 00:41:59,779
[Jacob Newsted]: Because it could happen. It could happen if you're not aware and ready to read all these things and learn. So that's all.
364
00:42:01,383 --> 00:42:07,195
[Anna Schultz]: Absolutely. To add to that, I mean, I think just playing around with OpenAI's ChatGPT has been really helpful,
365
00:42:07,912 --> 00:42:19,351
[Anna Schultz]: you know, just going in there and asking different prompts and kind of learning through trial and error, you know, seems to be, personally, a great way to kinda play around with it and test it out and get better at it. So...
366
00:42:19,903 --> 00:42:20,381
[Anna Schultz]: Absolutely.
367
00:42:21,656 --> 00:42:29,280
[Anna Schultz]: Cool. Well, I guess there you have it, an incredible deep dive into the captivating world of ChatGPT and generative AI with
368
00:42:29,717 --> 00:42:32,099
[Anna Schultz]: a ton of invaluable insights from our guests.
369
00:42:32,972 --> 00:42:36,784
[Anna Schultz]: We hope you all found this episode as enlightening and inspiring as we did.
370
00:42:37,833 --> 00:42:42,696
[Anna Schultz]: And our heartfelt thanks goes out to our experts for sharing their time and their knowledge with us today.
371
00:42:43,892 --> 00:42:55,774
[Anna Schultz]: To our audience, if this episode sparked your curiosity and got you thinking about the incredible potential of ChatGPT and generative AI for your business, be sure to hit the like button and subscribe to the Real Intelligence podcast.
372
00:42:56,493 --> 00:42:59,609
[Anna Schultz]: We'd love to hear your thoughts, so don't hesitate to reach out and engage with us.
373
00:43:00,567 --> 00:43:14,285
[Anna Schultz]: And if you're ready to explore how ChatGPT can revolutionize your business, our team at RXA and Ready Signal are here to guide you on this transformative journey. Please reach out to us at
[email protected] and let's unlock the full potential of generative AI together.
374
00:43:15,085 --> 00:43:20,780
[Anna Schultz]: The Real Intelligence podcast is presented by RXA, a leading data science consulting company.
375
00:43:21,340 --> 00:43:24,480
[Anna Schultz]: RXA provides project-based consulting, staff
376
00:43:24,780 --> 00:43:26,880
[Anna Schultz]: augmentation, and direct hire staffing services
377
00:43:27,180 --> 00:43:33,869
[Anna Schultz]: for data science, data engineering, and business intelligence to help our clients unlock the value in their data faster.
378
00:43:34,901 --> 00:43:40,897
[Anna Schultz]: Learn more by visiting our website at www.rxa.io
379
00:43:41,351 --> 00:43:45,484
[Anna Schultz]: or contacting our team at
[email protected] today.