Secrets of AI Enablement: Bringing AI to Value Faster – with Sidharth Ramachandran, RTL Deutschland
Shownotes
Sidharth Ramachandran (Senior Director Data & AI, RTL Deutschland) shares how RTL’s AI Incubator helps bring GenAI ideas into reality in 8–20 days – from first idea to a working prototype. He also shares what makes AI work different from “deterministic” software, why enablement and hands-on workshops matter as much as engineering, and how to scale adoption without turning governance and permissions into a bottleneck.
This episode is part of our DATA Festival series, featuring speakers from our upcoming event in Munich. Stay tuned for more exciting insights from industry leaders sharing their cutting-edge projects and innovations.
Learn more about RTL's AI journey on stage at the DATA Festival Munich in June – one of Europe's leading events for data, AI, and technology leaders.
👉 Save your spot now: https://hubs.li/Q044Z4qG0
Wolfgang Klein on LinkedIn: https://tinyurl.com/28ja2pyk
Florian Bigelmaier on LinkedIn: https://tinyurl.com/4z84k8v7
Carsten Bange on LinkedIn: https://tinyurl.com/4j96bfnf
BARC on LinkedIn: https://tinyurl.com/3ft3vpxv
Show transcript
00:00:00: Let's staff up a whole team, let's create a project plan.
00:00:04: Let's create a roadmap.
00:00:05: All of that is important.
00:00:06: I'm not taking that away from it, but if you are able to prove the value with a live working prototype, then selling the business case becomes way easier.
00:00:34: Hello and welcome to the Data Culture podcast.
00:00:37: I'm Carsten Bange, founder & CEO, and with me today again is Florian Bigelmaier.
00:00:42: That tells you it's a DATA Festival edition.
00:00:45: Again, we host an interview with guests or speakers of upcoming DATA Festivals.
00:00:56: Today we have the pleasure of talking to Sidharth Ramachandran.
00:01:01: He is Senior Director for Data and AI at RTL Deutschland, the media company.
00:01:10: And he's leading the AI Incubator!
00:01:13: That is what we are going to talk about.
00:01:14: Absolutely. He is doing both AI enablement and AI engineering, which means his team is actually delivering AI solutions.
00:01:23: But we talked more about the enablement part, because we thought it would be super interesting to figure out: how are they doing training?
00:01:30: How are they doing enablement?
00:01:32: How are they doing education?
00:01:34: And how are they dealing with certain sentiments, where people don't agree right away that AI might be the right solution, so also resolving some of those conflicts.
00:01:47: But with that,
00:01:48: I think we can jump right into the discussion.
00:01:51: Sidharth, good to have you here.
00:01:55: Nice to meet you
00:01:56: Absolutely, great to have you at the Data Culture podcast.
00:02:01: So, why don't we start right away with one question?
00:02:05: We saw that your role is Senior Director Data and AI at RTL, and you're leading the GenAI Incubator there.
00:02:13: What does your role there look like?
00:02:15: What are your priorities?
00:02:19: Yeah.
00:02:19: So as I said, I am part of RTL data.
00:02:23: Within RTL Data we have a unit called AI Enablement and Engineering, and I lead the AI Incubator team within that.
00:02:32: Our role is basically to help bring GenAI projects much closer to reality in a fast and easy way.
00:02:45: So with the whole GenAI wave and model improvements, there are a lot of ideas that can be proved out very quickly.
00:02:55: Even more so now using coding agents.
00:02:59: So you don't necessarily need a long process for understanding projects and deciding which project to build.
00:03:08: You could, in eight to twenty days, have a prototype to really prove to the stakeholder whether it works or not.
00:03:22: At the end, you have something that is actually testable and usable.
00:03:26: And so users also get a sense of OK?
00:03:29: Is this what I had imagined?
00:03:31: Is this going to work, is it just more effort to make it even better, or is it like, whoa!
00:03:36: It already works... I don't need anything else, done.
00:03:39: That is our role in the AI Incubator.
00:03:46: Till now we've done a lot of projects, more in the hundreds basically.
00:03:52: And not all of them have worked, which is also part of the journey, because you want to get rid of the projects that don't work, not spend too much time on those, and focus on the things that really bring value and then build them out as a proper product,
00:04:07: with an engineering team and a backlog.
00:04:13: We talked to a lot of companies that try to figure out how to bring AI really into production because we see these big barriers.
00:04:21: Was that the reason why your team was founded?
00:04:24: Definitely. So I think it started off almost... yeah, four years ago, or maybe let's say three years, after the whole ChatGPT moment, and we already had a collaboration with Microsoft where we were at the Ignite conference showing what is possible with DALL·E.
00:04:39: How do you integrate it into RTL Plus?
00:04:41: And so this was sort of the genesis, really the project where we said: wow!
00:04:46: This thing is so cool... and it's so fast.
00:04:48: So if you're able to get into this fast mode of working, which is not typically the norm...
00:04:55: The idea is that you go through the entire prototyping and value delivery mechanism much faster, instead of relying on: let's staff up a whole team, let's create a project plan.
00:05:09: Let's create a roadmap.
00:05:11: All of that is important.
00:05:12: I'm not taking that away from it, but if you are able to prove the value with a live working prototype, then selling the business case becomes way easier, and you can enter that project with way more confidence than with just PowerPoint slides or something like that.
00:05:33: Yeah, that makes a lot of sense.
00:05:34: Before we dive into how to do that: I noticed RTL Data. Could you talk a little bit about where you are embedded?
00:05:44: Is that like a data office?
00:05:46: Or what is this unit?
00:05:48: Yeah, also very good question.
00:05:50: So RTL Data is the centralized competency center within RTL.
00:05:57: RTL Data encompasses a lot of activities across the entire data landscape, and also includes audience research, panels, marketing studies, but also more tech-driven initiatives like a centralized data platform, specific data-driven projects, and now increasingly AI-driven projects and AI competency.
00:06:22: So it's really a very strong unit that lives at the heart of RTL and supports customers and stakeholders across different areas.
00:06:31: So be it marketing, be it content, be it production or post-production: a competency center for everything at RTL that is data or data-related.
00:06:45: Yeah, how many people are in that center?
00:06:48: I think it is roughly one hundred to one hundred and fifty.
00:06:51: I'm not sure of the exact numbers, but that's roughly the size.
00:06:55: Excellent.
00:06:56: So,
00:06:57: what you're saying is there's a bigger competence center for data.
00:07:02: The immediate question I'm having is: how does GenAI, the work with generative AI models, differ from what you did before, with data analytics, business intelligence, maybe machine learning and so on?
00:07:15: Can you lead us a bit through it?
00:07:17: And also, is there a nuance between an incubator and a center of excellence?
00:07:22: Maybe let's put those two questions together.
00:07:25: Yeah, so I think it's very interesting.
00:07:28: It is a journey that I have also lived through. In the past we were building machine-learning-driven products or data-analysis-driven products to help stakeholders make decisions.
00:07:41: There was a long cycle, maybe over ten years, where you educated your stakeholders on what machine learning is and what the accuracy of a model is.
00:07:51: The fact that it's never going to be a hundred percent accurate.
00:07:55: You kind of built an entire education process around telling them: hey, for a search result, what is the mean reciprocal rank, for example?
00:08:04: So explaining that KPI and so on.
00:08:07: And I now find myself in a situation where your stakeholders have become way broader than before, because earlier we were selling to specific areas within a particular organization which were already data-literate.
00:08:25: I would say they had an idea of what the data looks like, what analytics look like, and what some machine learning methods are.
00:08:33: And now suddenly you're talking to stakeholders who are more exposed to traditional software engineering and not really data products.
00:08:42: So they have the same expectations of AI as they would of software engineering.
00:08:46: Hey, here is a solution.
00:08:48: This tool does what it says, and it always does what it says.
00:08:52: And this is where it becomes challenging, because with AI-driven solutions it's never the case that it will always work.
00:08:58: There is always an inherent probabilistic nature; there is a sense of accuracy, that it will never give you a hundred percent.
00:09:06: What does hallucination look like?
00:09:09: And stuff like this... It's still very helpful, but it is not deterministic software, and so I'm having to re-explain this in many of our conversations and get people used to the fact again that, hey!
00:09:23: This isn't deterministic software... it won't always do what you expect, and you need to give feedback.
00:09:27: And what does the feedback mechanism look like?
00:09:29: How do we use the feedback mechanism to improve the product itself? So this, I think, has been the biggest change for me in moving from a data-driven products landscape to more AI-driven projects.
00:09:44: Then explaining this whole thing... to the second part of your question, which is the center of excellence.
00:09:50: I think RTL Data as a whole is considered the center of excellence.
00:09:54: So we don't have, let's say, a dedicated subunit within that.
00:09:59: So be it a competency center on data analytics, data engineering or AI now... that is sort of RTL Data.
00:10:06: So I think in a way that's the role we are playing.
00:10:09: Also, whenever there is a partner conversation to be had or new tooling introduced, we're of course consulted in the process, bringing our expert opinion and helping stakeholders make their decision, and also, uh... in the process of introducing this
00:10:27: in the organization through means of education and training and things like that.
00:10:33: So yeah, I guess we play the role of a center of excellence, but we just don't officially call ourselves that.
00:10:38: One additional question on the differences, which you talked about so well, um... about the deterministic nature of software and the probabilistic nature of AI.
00:10:51: Do you maybe have an example for that, which makes it a bit more tangible?
00:10:56: How does that translate to the real world of the media business, so to speak?
00:11:01: Yeah, so I mean, we have a lot of examples.
00:11:04: But there is, for example, the way you edit a particular video.
00:11:10: So, there is a certain style.
00:11:12: There's a certain creative flair that you bring, which can also be done through agentic processes or assisted with agent-driven processes.
00:11:21: So when you cut a trailer, for example... there are certain elements to look at in the trailer, and this is naturally a creative process, right?
00:11:36: So there's somebody looking at it trying to understand the story.
00:11:40: Trying to put all of this together. And naturally this does not lend itself very nicely to software, because there is a creative person that's involved there.
00:11:49: And they are more like saying: hey!
00:11:51: This is kind of my job.
00:11:53: How am I supposed to automate this?
00:11:55: And I think the role of AI here is not really to automate but to assist.
00:12:02: And that means you provide this editor or creative person with a list of suggestions based on their criteria.
00:12:10: Based on certain guidelines and previous episodes which have had trailers generated like these.
00:12:19: This will not always be the same, because the criteria themselves are not a rigid set of rules.
00:12:25: So deterministic software would say: always give me a scene where there are two people talking, and always give me a scene where there is an explosion, and so on.
00:12:36: That is a deterministic way to assist.
00:12:41: But when you're talking about doing this in an AI-driven way: is the definition of an explosion the same each time?
00:12:49: If there are two people talking but ten people in the background, is that still two people talking to each other?
00:12:56: These are places where it's not easy to distinguish.
00:13:02: And this is also where AI... it isn't that easy for any AI to make these judgments.
00:13:07: So will you get the same set of suggested clips each time when you're trying to cut a trailer
00:13:14: for a particular video?
00:13:15: No!
00:13:16: The clips you see differ each time.
00:13:21: This is the whole journey I think we need to go through.
00:13:24: That software is no longer deterministic but probabilistic, and what does that mean?
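To make the deterministic-versus-probabilistic distinction from this conversation tangible in code, here is a minimal, purely illustrative Python sketch. The clip data, field names, and the scoring function are all invented for the example: a rigid rule always returns the same clips for the same input, while a fuzzy scorer, standing in for an AI model, can return different suggestions on each run.

```python
import random

# Toy clip metadata; fields are invented for illustration.
CLIPS = [
    {"id": 1, "people": 2, "explosion": False},
    {"id": 2, "people": 12, "explosion": False},
    {"id": 3, "people": 0, "explosion": True},
    {"id": 4, "people": 2, "explosion": True},
]

def deterministic_pick(clips):
    """Rigid rules: exactly two people talking, or an explosion.
    Same input -> always the same output."""
    return [c["id"] for c in clips if c["people"] == 2 or c["explosion"]]

def probabilistic_pick(clips, rng):
    """Toy stand-in for an AI scorer: each clip gets a fuzzy
    'relevance' score, so the suggestion list can vary per run."""
    scored = [(rng.random() + c["people"] / 10, c["id"]) for c in clips]
    scored.sort(reverse=True)
    return [cid for _, cid in scored[:2]]

# Deterministic: identical on every call.
assert deterministic_pick(CLIPS) == deterministic_pick(CLIPS)
# Probabilistic: different random states can yield different suggestions.
print(probabilistic_pick(CLIPS, random.Random(1)))
print(probabilistic_pick(CLIPS, random.Random(2)))
```

The point of the sketch is only the contrast: the first function is testable like classical software, the second needs the kind of evaluation and feedback loops discussed above.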
00:13:30: Absolutely.
00:13:31: Now let's dive into, say, your daily job of bringing AI use cases into production fast, which I think is a super relevant and interesting topic.
00:13:41: So what are the topics you're focusing on?
00:13:49: And where do you see
00:13:52: the success factors?
00:13:55: So as a media company, what I really love about working at RTL is the fact that I get to work on SO many use cases.
00:14:02: So it's not just video editing which we have been really focusing on but also creation of texts and marketing images.
00:14:12: But we also have our own sales house.
00:14:15: There it's a lot about CRM automation and things like this.
00:14:19: Given the wide variety of applications in our use cases, we're not saying that the AI Incubator is the only one doing it.
00:14:28: There are also several colleagues we work with directly and enable in various parts of the organization to adopt AI and things like this.
00:14:36: So my day-to-day, I would say, has two main components.
00:14:42: On the one hand, what I would call evangelizing or training or AI diffusion: really talking to people, trying to help them understand what AI is.
00:14:56: Which tool kind of fits which use case?
00:15:00: What is the use case? Is it actually possible to do or not?
00:15:03: So really opening their minds, doing a bunch of trainings, doing some workshops, and getting everybody on the same page in terms of the language that you use when talking about AI.
00:15:14: What are the tools we have access to?
00:15:17: How can you use these tools, and where do you need, let's say, more specialist help for a dedicated solution or something like this?
00:15:26: So that brings me to the second part, which is: I feel there are a lot of things people can really do themselves.
00:15:34: For example, do you need to build a custom chatbot for each and every use case?
00:15:39: Probably not!
00:15:40: You can go into ChatGPT and build your own custom GPT today, if you know how to do it or if you are aware of it.
00:15:47: That is the first part, where we enable people to do it themselves, right?
00:15:52: Because this is the beauty of AI: now everybody can kind of do it themselves.
00:15:57: But on the other hand, when you talk about more permanent solutions, or solutions that are a bit larger, extend beyond one team and comprise multiple aspects of a workflow, then things start becoming a little complex.
00:16:13: For example if we consider a very typical data use case where you want to build an automated data analyst, right?
00:16:21: It's something that we have been used to doing.
00:16:24: You write SQL queries, you build some visualizations and so on, but the key thing here is that you need to have this AI analyst connected to all of your data sources.
00:16:34: It's not just your data sources buried in BigQuery or wherever they live... it also needs to be connected with the right permissions, so not everybody should have access to all tables, because they might contain sensitive data.
00:16:46: And the context for this information, the documentation, might lie elsewhere, right?
00:16:51: So what do the different columns look like?
00:16:53: What do they mean?
00:16:54: That might lie in a completely different system, be it Confluence or any other knowledge repository you're using, and now expecting a stakeholder to build an AI analyst themselves is probably a step too far.
00:17:07: This is where the Incubator comes in, which is the second part of my job: really conceptualizing these solutions in a more rigorous fashion, where we are respecting rights and privileges, building the necessary connectors and so on.
00:17:22: So this is what the AI Incubator and the teams around it really support, which is to say: some solutions, although they're AI-driven, are not things each and every team can build themselves, and those get built for them in a proper way.
00:17:40: These, I think, are the two parts
00:17:42: of my role leading the AI Incubator.
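The permission point made above can be sketched in a few lines of Python. This is a rough, hypothetical illustration, not RTL's implementation: the table names, roles, and ACL structure are invented. The idea is simply that before an AI analyst ever builds a prompt, the available tables are filtered by the requester's rights, and documentation (e.g. pulled from a knowledge repository) is attached only for those tables.

```python
# Hypothetical access-control list: table -> roles allowed to query it.
TABLE_ACL = {
    "marketing.campaigns": {"marketing", "data_team"},
    "hr.salaries": {"hr"},  # sensitive: restricted to HR
}

def tables_for(role):
    """Return only the tables this role's AI analyst may query."""
    return sorted(t for t, roles in TABLE_ACL.items() if role in roles)

def build_context(role, docs):
    """Combine the allowed tables with their documentation into the
    context an AI analyst would receive; undocumented tables are flagged."""
    allowed = tables_for(role)
    return "\n".join(f"{t}: {docs.get(t, 'no docs')}" for t in allowed)

# Example: a marketing user's analyst never sees the HR table.
docs = {"marketing.campaigns": "one row per campaign; spend in EUR"}
print(build_context("marketing", docs))
```

In a real system the ACL would live in the warehouse's own IAM layer rather than in application code; the sketch only shows why a permission-aware connector is something a central team builds, not each stakeholder.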
00:17:46: You already talked a bit about the education part or actually quite, quite a bit.
00:17:51: Why do you think that is so critical?
00:17:53: Why is it so important
00:17:55: as part of your journey? I mean, you could also just go there and say: here are the AI experts,
00:18:00: we're gonna build that and bring it into the company. Because these days we see a lot of companies that rather try to enable their people out there.
00:18:12: Yeah, what's your reason to do it that way?
00:18:17: I think the reason for me is really exciting, because I believe AI can act as a sort of equalizer
00:18:26: to some extent. And it's like a general-purpose technology.
00:18:30: That means, yes, someone with a background in data science who has built models before
00:18:38: understands more about how these models are trained, which is kind of where our expertise comes in. But as a consumer-facing technology, we are at a place where each and every person can enable themselves to do so much more than what they're doing now.
00:18:55: They can automate various aspects of their work themselves without needing a tech team, a software engineer or a data engineer.
00:19:08: Pretty much everybody is now familiar with Office tools, right?
00:19:11: So you kind of know how to use Excel.
00:19:12: You know how to write a PowerPoint presentation and so on.
00:19:16: And when you make a Word document: there are some Word documents that you use only to share within your team or with your manager.
00:19:25: Then there are some Word documents where you put a lot more effort in.
00:19:30: Those are meant for publication, either as a blog post or a press release by the company.
00:19:37: I believe we are entering a similar era of software solutions, just like Word documents.
00:19:44: So there is some software solution small apps or helper tools that you use for yourself and within your team.
00:19:52: It solves the problem, and that's it!
00:19:56: It doesn't need to be super production-ready; it does not have to scale to a million users.
00:20:00: Such an app is similar to a Word document: it works just for you.
00:20:06: Great. But there are some apps that you really want more people to use, and stuff like this.
00:20:12: And then you do need good software engineering practices, which is where teams like us come in.
00:20:18: But the fact that we want to enable you to build these small tools yourself, be it a custom GPT or an agent that works for your limited context, already helps you directly.
00:20:31: This is where I think the real power of AI lies: when everybody in the organization is able to do this themselves. Obviously, like I said, there are some Word documents
00:20:44: you just write for yourself and your boss, and some you create so you can go to a conference and talk about them.
00:20:53: In the same way, there will be some tools or agents that you build for yourself, some tools or agents that you build for your team or for your boss or a small set of people, and then some will really be company-wide, which need to be built properly.
00:21:06: But the things people are doing on their own,
00:21:08: you can easily do today, given the set of tools and technology.
00:21:12: That is why I think it's important to enable everybody, because they're the ones with the domain expertise.
00:21:20: I am not a video editor.
00:21:21: I don't know what a good trailer looks like; this is something that's in the mind of somebody who does this role, and if I can enable them to help themselves, that's like the productivity booster!
00:21:32: That's like the awesome feeling they get, because they are able to do their job much better than me helping them build some solution or something like this... I mean, there is a place for that.
00:21:43: But if you are able to help yourself, that's awesome, and it gives you such a nice feeling, because: I learned something new, I built something cool.
00:21:51: And now I'm able to do so much more.
00:21:55: In terms of enablement, what did you find were the most effective means?
00:22:01: Like certain trainings, maybe, or e-learning?
00:22:06: How do you run the enablement?
00:22:08: But most importantly, what do you think works best?
00:22:13: I think what works best for me is... So we've done the whole bit, right?
00:22:17: We started off with trainings on prompting and all these kinds of things.
00:22:21: Maybe two or three years ago that was more in large rounds,
00:22:24: just to get people introduced to the idea of what AI is.
00:22:28: What does a prompt look like?
00:22:30: Give them some basics, some guardrails, and stuff like this.
00:22:33: But the thing that really stands out for me is doing small-group workshops directly with domain experts to solve problems in their domains.
00:22:45: So a typical way I do this: let's say you are talking to fiction writers, right?
00:22:51: You go and have a small group of four to five fiction writers.
00:22:54: They're the ones writing dialogues or scenes for various episodes and things like these, and you ask them: how do you do your job? What is the biggest bottleneck today in doing your job?
00:23:06: And then they explain to you what the annoying parts of their jobs are. Because I think every job has something that brings joy,
00:23:17: but there are aspects that everybody doesn't like so much.
00:23:21: Admin tasks may be a bit annoying and stuff.
00:23:24: I'm sure nobody likes to format stuff in a Word document, right?
00:23:29: You want to write the ideas that go into the document.
00:23:31: But should you be focusing on formatting?
00:23:35: Or maybe you need to do a lot of Google research to find out about a particular topic.
00:23:40: So we take these annoying bits as use cases and show them, right there in the workshop, how you could use a bunch of these tools to make that part easier.
00:23:52: And I think this serves two purposes.
00:23:55: One, it grounds AI in their context, because they can also see the output of such a tool and tell you whether it's actually good or it's bullshit. Because this is an aspect they need to learn.
00:24:13: AI is not some magic tool.
00:24:15: A lot of the results it produces are so-called AI slop.
00:24:21: So as a domain expert, you're able to tell when something actually looks good or reads well, and when something is slop.
00:24:27: They also get this sense of: hey, what can it do?
00:24:30: What can it not do?
00:24:31: And things like that.
00:24:32: That's the first part.
00:24:34: So you really ground it in their lives, in their domain.
00:24:38: The second part: it shows them AI isn't here to replace me.
00:24:43: I am still valuable.
00:24:46: I am getting rid of these annoying bits so that I can focus on the things that I really enjoy doing.
00:24:54: So it makes the whole experience like a co-working session... you are co-working with an AI colleague or something like this, and you leave the workshop feeling a lot more positive.
00:25:07: You leave the workshop with a sense of wow, because your mind has opened up to possibilities.
00:25:13: The best example I heard at the end of such a workshop was somebody saying: hey, I can now imagine that if I had a hundred colleagues with me, or a hundred interns working with me, what more could I do?
00:25:33: And this really opens up your mind to the possibilities!
00:25:37: That's the kind of outcome you want, and not the other side where people are worried and stuff like this.
00:25:47: So this is very important, because people realize that AI isn't a silver bullet or something like that.
00:25:54: It's actually many times pretty crap in terms of output, but that's where the human element comes in for improvement and things like that.
00:26:01: For me, it's what I have always enjoyed doing.
00:26:05: That's really interesting, because,
00:26:07: we can tell, we at BARC also experiment with AI, and a use-case-driven approach is something that really sticks with a lot of people.
00:26:16: That works pretty well, because you can easily see how people adopt it; you can go step by step and iteratively... And particularly, like I said, it also gives the people a chance to figure out what their role will be in a world where AI is there.
00:26:33: Because the first time we get into contact with it, and the headlines say it is killing all white-collar jobs, it feels super hard, and then you kind of find your way, like: okay, now I see where my domain expertise is playing a role.
00:26:48: Finally.
00:26:48: Right?
00:26:50: Absolutely right!
00:26:51: These job-loss fears are really dominating headlines today... And I don't think they're wrong. We should not dismiss them.
00:27:00: There's genuine fear... I also experience it every day, because my job was more like writing code or something, and a lot of the agents can do that now.
00:27:13: So should I be scared?
00:27:14: That's genuine worry.
00:27:17: But as humanity we've always been super creative; as a species we have evolved so much, starting from just fire to steel, uh... to the Industrial Revolution and then the internet,
00:27:34: and stuff like this.
00:27:35: We are a super creative species, and we always find something new using technology.
00:27:43: So the perfect example that I always talk about is apps, like ride-sharing apps.
00:27:48: Like Uber and stuff like this could not have existed if you didn't have smartphones at everyone's fingertips.
00:27:57: And there was something to the argument: hey!
00:28:01: This is hurting the livelihoods of existing taxi drivers.
00:28:06: That's true.
00:28:07: That's where government policy and regulation come in. But the fact that anybody sitting right now at home can order a cab was mind-blowing, I don't know, maybe ten years ago or something like this.
00:28:19: And I think that AI also should be seen as a tool that will lead us to more cool things... I always struggle to convey this sense of imagination, and I'm struggling myself to
00:28:32: imagine what is the new thing that AI enables.
00:28:35: But I think we've seen it in the Industrial Revolution: it started in factories, but then there were cars and railways.
00:28:43: And at that time, when you just saw your first steam engine working, could you imagine that one day this would lead to a Ford Model T or something?
00:28:56: If I had been there, could I have imagined it?
00:28:58: Because our imagination is also limited very often.
00:29:02: That's what I always encourage people to do: imagine.
00:29:06: What more would you do with this?
00:29:08: What more can you do? And somehow convey that sense of positivity and human creativity rather than focusing on the negatives.
00:29:16: I think the negatives are important, and I think they should be reflected on, for sure.
00:29:22: But where do you spend your energy?
00:29:25: I think that's what I'm always conveying to people.
00:29:30: Let's be positive, let's adopt this, let's see where it goes.
00:29:33: Yeah,
00:29:35: I like that.
00:29:36: And for sure we will see new roles emerging again, so people will be doing new things.
00:29:44: One role I spoke about lately was the agent supervisor.
00:29:49: Let's also look a little bit at the other side of what you said you're doing.
00:30:01: You said it's enablement when we talked about that, but it's also about evangelizing, and I think that is super interesting!
00:30:07: You already mentioned a few aspects, I think, like for example: make it relevant, yeah?
00:30:12: Like an example from day-to-day work.
00:30:14: But maybe you can share some more success factors again.
00:30:20: So how would you evangelize most effectively in your organization at RTL?
00:30:29: So first of all, it's the training and education piece.
00:30:33: I think that is most important because people then buy into it.
00:30:36: The second aspect of this is also showing something rather than talking about it.
00:30:43: This was where the idea for the Incubator came in.
00:30:47: In typical large organizations there are a lot of pulls and tugs, because different people have different priorities, and so on.
00:30:58: So the ability to already show the solution: you hear an idea,
00:31:07: and then in a week... the follow-up meeting already has something to show. That is mind-blowing for a lot of people, because they can already touch or experience their solution.
00:31:19: They are able to tell right there: okay, that doesn't work, or no, I thought it would work this way.
00:31:25: So you already get awesome feedback which you wouldn't have if you were still at the PowerPoint-slide stage.
00:31:31: And there are a lot of requirements that are not explicitly stated anywhere.
00:31:38: They're just in the minds of people.
00:31:40: They say: hey, I want a solution that does X, Y and Z, but actually they also want A, B and C. But they didn't know it themselves; they thought it was obvious.
00:31:51: And then they see the solution live, and then say: hey, but why does it not do A?
00:31:56: We're like: oh, you didn't tell us that!
00:31:59: Then their answer is: oh, I thought this was obvious.
00:32:01: So having something there, showing it, asking them to try it, brings out these communication gaps way faster and quicker, which I think is also a large part of building any solution.
00:32:19: So I think that's another thing I really found: this incubator-style way of working is just way faster and easier.
00:32:30: Absolutely!
00:32:31: So thanks for leading us through your work at RTL and what you're doing there.
00:32:38: One open topic, of course: we will have you as a speaker at the DATA Festival in Munich, and we would obviously love to know, what will you talk about?
00:32:50: Can you give us a little teaser about that?
00:32:54: Yeah for sure.
00:32:54: I think the journey of AI adoption is kind of what we've been talking about, but I will maybe go a bit deeper into that and present some funny aspects of it.
00:33:07: So I imagine it to be a more humorous talk, because there are so many learnings that we've had in the last three years, where you would expect something and then reality is something else, because stakeholders have super unrealistic expectations, or think: why isn't this already happening? When actually there is a lot that needs to happen before
00:33:36: it gets there.
00:33:37: There's also a lot of hype around it.
00:33:40: So both on the positive and the negative side: on one hand overselling what AI can do,
00:33:49: on the other also creating job-loss fears, where people are reluctant to share or engage because they're actively pushing back against the use of AI in one aspect or another, and things like this.
00:34:04: So I think the talk there will be way more humorous,
00:34:09: talking about some of our instances of things that worked, and also of the work on simple things which, let's say, are not so obvious.
00:34:19: That sounds great.
00:34:21: And here's a secret, you have been recommended to us and someone said your talk is really fun.
00:34:28: So I'm looking forward to it!
00:34:31: This will be June the sixteenth and seventeenth.
00:34:34: See you there at the latest.
00:34:35: Thanks for sharing all of your insights today... and I am looking forward to meeting you in person in Munich at the DATA Festival.
00:34:42: Looking forward to doing this myself, too.