A week full of AI and LLMs

I had the privilege of chairing the AI and Big Data summit organized by TechEx, which drew significant interest from professionals and enthusiasts eager to explore the latest developments in AI and LLMs. I also participated in the Productcamp Europe event hosted by Epam Systems and Emakina, which provided an excellent platform for discussions on data, AI and machine learning, approached from the product angle. The complete recordings of TechEx can be found here.

The Interplay of Data Science and Business Strategy: A Glimpse into TechEx’s Applied Digital, Data & AI Conference

Applied Digital, Data & AI Conference

The attendees, a mix of industry leaders, data scientists, and curious minds, were there for one reason: to delve into the future of Applied Digital, Data, and Artificial Intelligence. As the General Partner at Allegory Capital, I had the privilege of opening the conference, setting the stage for a day of rich discussions and groundbreaking revelations.

The first topic that caught everyone’s attention was the presentation by Sanchit Juneja, the Director of Product for Data Science and Machine Learning at Booking.com. Sanchit’s discourse on Large Language Models (LLMs) was enlightening. He emphasized how Booking.com had already integrated LLMs into their workflows and was in the process of scaling these initiatives. The company’s ability to handle enormous volumes of data daily was a testament to the efficacy of these models. This revelation is a technological milestone and a harbinger of what’s to come for various industries. The integration and scaling of LLMs are no longer confined to tech giants or pure digital players. Traditional sectors, including healthcare and finance, are also beginning to realize the transformative potential of these models. However, the challenge lies in scaling these initiatives beyond the proof-of-concept stage, a hurdle many industries have yet to overcome. To illustrate, Sanchit shared how Booking.com uses LLMs to enhance customer experience. The LLMs can generate personalized travel recommendations by analysing customer reviews and feedback, increasing customer engagement and revenue. This practical application makes a compelling case for how LLMs can drive measurable value across different business functions.
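
Booking.com’s internal pipeline is of course not public, but a minimal sketch of the pattern Sanchit described, prompting an LLM with a guest’s profile and past reviews to produce personalised suggestions, might look like this. The `complete` callable is a hypothetical stand-in for whatever LLM client is actually used.

```python
# Hypothetical sketch: prompting an LLM to turn customer reviews into
# personalised travel recommendations. `complete` stands in for whatever
# LLM client the team actually uses (a hosted API, an in-house model, etc.).

def build_prompt(guest_profile: dict, recent_reviews: list[str]) -> str:
    """Combine a guest's stated preferences with reviews they wrote."""
    reviews = "\n".join(f"- {r}" for r in recent_reviews)
    return (
        "You are a travel assistant. Based on the guest profile and their "
        "past reviews, suggest three destinations with a one-line reason each.\n"
        f"Profile: {guest_profile}\n"
        f"Past reviews:\n{reviews}\n"
        "Answer as a numbered list."
    )

def recommend(guest_profile: dict, recent_reviews: list[str], complete) -> str:
    """`complete` is any callable that sends a prompt to an LLM and returns text."""
    return complete(build_prompt(guest_profile, recent_reviews))

if __name__ == "__main__":
    # Stubbed completion function so the sketch runs without any API key.
    stub = lambda prompt: "1. Lisbon - you liked walkable, sunny cities."
    print(recommend({"budget": "mid", "travel_style": "city breaks"},
                    ["Loved the old town in Porto, great food."], stub))
```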

The conversation around Large Language Models naturally segued into a broader discussion about the future of retail and software development, led by Peter Winkelman, Chief Enterprise Architect at Carrefour, and Nuno Carneiro, AI Principal Product Manager at OutSystems. Their keynote presentation was a deep dive into Carrefour’s AI vision and how it aims to bridge the tech talent gap while achieving digital transformation goals. Carrefour’s strategy is not just a blueprint for retail; it’s a lens through which we can view the future of AI in business. The company is investing heavily in AI to automate processes, enhance customer experiences, and even transform the very nature of software development. The implications of this are far-reaching. As businesses strive to close the tech talent gap, AI emerges as a powerful ally, automating tasks that would otherwise require extensive human intervention. Peter and Nuno also shared a case study that demonstrated the transformative power of AI in retail. Carrefour implemented an AI-powered inventory management system that streamlined operations and led to significant cost savings. This real-world application serves as a testament to the untapped potential of AI in transforming business operations and achieving strategic objectives.

But the conference wasn’t just about celebrating the achievements and potential of AI and data science; it was also a platform for discussing these technologies’ ethical and environmental implications. Kristian Kofoed-Solheim, Business Development Director at Bulk Data Centers, took the stage to discuss how companies can manage High-Performance Computing (HPC) and AI workloads sustainably. Sustainability in the context of data science is often overshadowed by the allure of technological advancements. Yet, as Kristian pointed out, the environmental footprint of data centres is a growing concern that cannot be ignored. Companies are now faced with the challenge of balancing scalability with ecological responsibility. Kristian’s talk was punctuated by an example from Bulk Data Centers, which has implemented renewable energy solutions to power its data centres. This initiative reduces the company’s carbon footprint and serves as a model for other organizations grappling with the environmental impact of their data practices.

The conference then shifted its focus to the concept of an “Augmented Workforce,” a term that has gained significant traction in the lexicon of data science and machine learning. The panel discussion, moderated by Marloes Pomp, Coordinator of International Network ELSA labs, explored the rapid rollout of intelligent automation solutions, leveraging technologies like Natural Language Processing (NLP) and Robotic Process Automation (RPA). Integrating machine learning algorithms into the workforce isn’t merely a technological evolution; it’s a paradigm shift in how we conceptualise labour and productivity. The panellists discussed the concept of “Human-in-the-Loop” (HITL) machine learning, where human expertise complements algorithmic decision-making. This symbiotic relationship between human intelligence and machine capabilities is what constitutes an augmented workforce. Rens van Dongen, Senior Information Security Officer at Dutch Railways, shared an intriguing case study on how they implemented a HITL system for cybersecurity. By using machine learning algorithms to filter and prioritize security alerts, they freed human experts to focus on complex cases requiring nuanced understanding, enhancing the overall efficiency and effectiveness of their security operations.
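
The Dutch Railways system itself wasn’t shown, but the human-in-the-loop triage pattern is straightforward to sketch: a classifier scores each alert, clear-cut cases are handled automatically, and uncertain ones are escalated to an analyst. The thresholds and the scikit-learn-style `predict_proba` interface below are illustrative assumptions.

```python
# Illustrative human-in-the-loop triage: the model handles clear-cut alerts,
# uncertain ones are escalated to a human analyst. Thresholds are invented.
from dataclasses import dataclass

@dataclass
class Alert:
    alert_id: str
    features: list[float]

def triage(alerts: list[Alert], model, low: float = 0.2, high: float = 0.9):
    """Route each alert based on the model's predicted probability of a real threat."""
    auto_closed, escalated, auto_blocked = [], [], []
    for alert in alerts:
        p_threat = model.predict_proba([alert.features])[0][1]  # scikit-learn style
        if p_threat < low:
            auto_closed.append(alert)      # confidently benign: close automatically
        elif p_threat > high:
            auto_blocked.append(alert)     # confidently malicious: block, then review
        else:
            escalated.append(alert)        # uncertain: human-in-the-loop decision
    return auto_closed, escalated, auto_blocked
```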

As the day progressed, Artificial Intelligence in business took centre stage, with Erbin Lim, Director of Engineering and Development at Pfizer, delving into the technical intricacies of AI deployment. He discussed the importance of “Feature Engineering,” the process of creating and selecting the input variables a machine learning model uses to make predictions. He also touched upon “Hyperparameter Tuning,” a technique used to optimize the performance of machine learning models. Erbin’s talk was a masterclass in the technical considerations that underpin AI’s transformative power in business. He emphasized the need for algorithmic transparency and discussed the frequency of model retraining, evaluation metrics like F1 Score and ROC-AUC, and the challenges of handling imbalanced datasets. To bring his points home, Erbin shared Pfizer’s journey in implementing a predictive maintenance model for their manufacturing equipment, where ensemble methods like Random Forests and Gradient Boosting significantly reduced equipment downtime, saving time and resources.
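
Pfizer’s actual model was not disclosed, but the building blocks Erbin named, an ensemble classifier, hyperparameter tuning, class-imbalance handling, and evaluation with F1 and ROC-AUC, fit together roughly as in this scikit-learn sketch on synthetic data.

```python
# Minimal sketch of the techniques mentioned above, on synthetic data:
# an ensemble model, hyperparameter tuning, and F1 / ROC-AUC on an imbalanced set.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import f1_score, roc_auc_score

# Synthetic "predictive maintenance" data: failures are rare (about 5% of samples).
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Hyperparameter tuning with F1 as the selection metric (imbalance-aware).
grid = GridSearchCV(
    RandomForestClassifier(class_weight="balanced", random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="f1",
    cv=5,
)
grid.fit(X_train, y_train)

proba = grid.predict_proba(X_test)[:, 1]
print("F1:", f1_score(y_test, grid.predict(X_test)))
print("ROC-AUC:", roc_auc_score(y_test, proba))
```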

The conference then delved into data ethics and governance, a topic that is increasingly salient in the Big Data and AI era. Joris Krijger, an AI and Ethics Specialist at de Volksbank, took the audience through the complexities of operationalizing ethical AI. He introduced the concept of “Differential Privacy,” a technique that adds carefully calibrated statistical noise so that companies can share aggregate data about user habits without revealing whether any single individual’s records are included. The implications of Differential Privacy are profound, especially for industries that handle sensitive data, such as healthcare and finance. It offers a pathway to leverage the power of data analytics without compromising on ethical considerations. Joris cited de Volksbank’s experience implementing Differential Privacy in their data analytics workflows, demonstrating its viability as a privacy-preserving mechanism.
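
De Volksbank’s implementation details weren’t shared, but the classic Laplace mechanism illustrates the core idea: add noise calibrated to the query’s sensitivity and a privacy budget ε, so an aggregate statistic barely changes whether or not any one individual is in the data. The data and ε value below are illustrative.

```python
# Textbook Laplace mechanism for a differentially private count query.
# For a counting query, one person can change the result by at most 1,
# so the sensitivity is 1 and the noise scale is 1 / epsilon.
import numpy as np

def private_count(values, predicate, epsilon: float, rng=np.random.default_rng()):
    """Return a noisy count of records satisfying `predicate`, with epsilon-DP."""
    true_count = sum(1 for v in values if predicate(v))
    sensitivity = 1.0
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Example: how many customers logged in more than 10 times this month?
logins = [3, 14, 7, 22, 1, 11, 9]
print(private_count(logins, lambda n: n > 10, epsilon=0.5))
```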

As the conference concluded, the focus shifted to the future of analytics and AI in enterprise settings. Logan Havern, CEO of Datalogz, discussed the challenges data-mature enterprises face in enabling self-service reporting. He introduced the audience to “Data Lake Architecture,” a scalable and flexible data storage solution consolidating structured and unstructured data. Logan emphasized the role of “Data Governance” in ensuring the quality and reliability of data stored in Data Lakes. He also discussed the concept of “Data Lineage,” which involves tracking the flow and transformation of data as it moves through various stages of a data pipeline. Understanding Data Lineage is crucial for ensuring data integrity and compliance with regulations like GDPR. To illustrate the practical applications of these concepts, Logan shared how Datalogz implemented a Data Lake solution that enabled real-time analytics. By using advanced “ETL (Extract, Transform, Load) Processes” and “Stream Processing,” they were able to provide actionable insights to their clients, thereby driving data-informed decision-making.
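
Datalogz’s platform wasn’t demonstrated in code, but the essence of data lineage inside an ETL pipeline can be sketched in a few lines: every step records what it read, what it wrote, and when, so any downstream table can be traced back to its sources, which is exactly what GDPR-style audits require. Step names, sources, and targets below are invented.

```python
# Minimal sketch of recording data lineage inside a small ETL pipeline:
# each step logs its inputs, output, and timestamp so downstream datasets
# can be traced back to their sources (e.g. for GDPR audits).
from datetime import datetime, timezone

lineage_log: list[dict] = []

def record_lineage(step: str, inputs: list[str], output: str) -> None:
    lineage_log.append({
        "step": step,
        "inputs": inputs,
        "output": output,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

def extract(source: str) -> list[dict]:
    rows = [{"customer_id": 1, "amount": "42.50"}]   # stand-in for a real read
    record_lineage("extract", [source], "raw_orders")
    return rows

def transform(rows: list[dict]) -> list[dict]:
    cleaned = [{**r, "amount": float(r["amount"])} for r in rows]
    record_lineage("transform", ["raw_orders"], "orders_clean")
    return cleaned

def load(rows: list[dict], target: str) -> None:
    record_lineage("load", ["orders_clean"], target)   # the write would happen here

load(transform(extract("s3://landing/orders.csv")), "warehouse.orders")
print(lineage_log)  # full chain: source file -> raw -> clean -> warehouse table
```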

Optimised Data & Analytics: Unlocking the Future

As Leonid Pavlov took the stage, the room was filled with anticipation. Pavlov, a seasoned Data & AI training expert, set the tone for the day with a compelling narrative that underscored the transformative power of data and artificial intelligence. He spoke not only as a technologist but as a visionary, painting a picture of a world where data isn’t just a byproduct of business operations but the lifeblood that fuels innovation and growth. As Pavlov sees it, the future is one where organizations will be divided into two categories: those who understand the strategic value of data and those who are left behind. This bifurcation is not a mere hypothesis; it’s a reality already taking shape. Companies like Amazon and Google have been pioneers in leveraging data to drive decision-making, and their success serves as a testament to the power of a data-driven approach. For businesses in regulated industries, the stakes are even higher: data isn’t just a strategic asset; managing it properly is a compliance requirement. The General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States are just the tip of the iceberg. Future regulations will likely be even more stringent, making data management a critical business function.

Mahmoud Yassin, a Senior Data Manager at Booking.com, took the stage with a sense of urgency that immediately captured the audience’s attention. His focus was on a topic that, while not as glamorous as machine learning algorithms or data visualization techniques, is foundational to any serious data endeavour: data lineage. Yassin began by introducing the concept of “Data Provenance,” a term that refers to a data set’s origin and the transformations it has subsequently undergone. This isn’t merely a technical exercise; it’s a critical business function. In a world where data is increasingly being used to make high-stakes decisions, the ability to trace the lineage of that data is not just a ‘nice-to-have’ but a ‘must-have.’ The implications of this are far-reaching. In regulated industries like healthcare and finance, understanding data lineage is not just beneficial; it’s legally required.
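
A complementary, equally minimal way to think about data provenance is to fingerprint each dataset snapshot along with its origin and the transformation applied, so consumers can later verify that what they are using is what was produced. This sketch, with invented fields and data, illustrates the concept rather than Booking.com’s approach.

```python
# Illustrative provenance record: fingerprint a dataset snapshot and note
# where it came from, so later consumers can verify its origin and detect drift.
import hashlib, json
from datetime import datetime, timezone

def provenance_record(dataset_bytes: bytes, source: str, transformation: str) -> dict:
    return {
        "source": source,                      # where the data was obtained
        "transformation": transformation,      # what was done to it
        "sha256": hashlib.sha256(dataset_bytes).hexdigest(),  # content fingerprint
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

snapshot = json.dumps([{"booking_id": 1, "nights": 3}]).encode()
print(provenance_record(snapshot, "bookings_api_v2", "dropped PII columns"))
```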

Basil Faruqui, Director of Application and Workflow Orchestration at BMC, took the stage next. Faruqui is a known thought leader in the realm of DataOps, a burgeoning field that aims to bring the rigour of DevOps to the world of data analytics. His presentation was a deep dive into the complexities and challenges organisations face when transitioning from data collection to data utilization, a process often fraught with pitfalls and roadblocks. Faruqui began by laying out the stark reality: despite the billions of dollars invested in data initiatives, many of these projects never make it to production. Citing industry estimates, he noted that only about 15% of modern data initiatives are successfully operationalized. This is not just a failure of technology; it’s a failure of process and, ultimately, of vision. The crux of the issue, as Faruqui sees it, lies in the lack of orchestration. In music, an orchestra’s success depends on the conductor’s ability to coordinate a diverse group of musicians. Similarly, in the world of data, success hinges on orchestrating complex operations that span multiple departments, technologies, and even organizational boundaries.

Moderated by Pietro Bertazzi, Global Director of Policy Engagement and External Affairs at CDP, the discussion promised to be a deep exploration of one of the most pressing issues in the data world: maximising the utility of data across an organization. The panellists, hailing from diverse industries, were united in their belief that data is not just a technical asset but a strategic one. Ashraf Ali K M, Director of Data Science & Analytics at Footlocker, emphasized the need for a holistic business strategy where data is integral to the value chain. He argued that data should not be confined to isolated departments but flow freely across business units to drive decision-making and innovation. This is not just a matter of technical integration; it’s a cultural shift. Giuseppe Lenci, a Business Intelligence Specialist at Van Oord, pointed out that breaking down data silos requires a change in mindset. Organizations must move from a culture of data ownership to one of data stewardship, where data is seen as a shared resource that can benefit all departments.

The panel also delved into the practicalities of aligning data with specific business goals. They discussed the importance of understanding seasonal buying cycles, customer demand, and other market dynamics. The ability to align data analytics with these factors can be a game-changer, enabling companies to anticipate market trends and adjust their strategies accordingly. But what really stood out was the panel’s focus on real-time intelligence. In a world where market conditions can change in the blink of an eye, the ability to make data-driven decisions in real time is not just an advantage; it’s a necessity. This is particularly true for companies in regulated industries, where the cost of failure is not just financial but can also include legal repercussions and reputational damage.

The panel’s insights were a clarion call for organizations to rethink their approach to data. It’s not enough to collect data; companies must also have the systems, processes, and culture to make the most of this invaluable asset. Those who do will be well-positioned to navigate the complexities of today’s rapidly evolving business landscape.

Ranade began by acknowledging a common pitfall in the analytics journey: the issue of data quality. He argued that many organizations are so focused on collecting data that they overlook the importance of its quality. Poor data quality undermines analytics efforts and can lead to misguided business decisions. This is particularly crucial in regulated industries, where data integrity is not just a best practice but often a legal requirement. However, Ranade’s approach to solving this issue was far from conventional. He introduced the concept of a “Multi-horizon Approach” to analytics, a framework that allows organizations to take meaningful steps in their analytics journey without getting bogged down by data quality issues. The idea is to start small but think big, making incremental improvements while keeping an eye on long-term goals. This approach is particularly relevant for companies in the process of digital transformation: organizations moving from legacy systems to more modern infrastructures often encounter many challenges, from data migration issues to cultural resistance.

Moderated by Bas Vertelman, a Real-Time Big Data Software Engineer, the discussion promised to delve into one of the most critical aspects of modern data analytics: real-time intelligence. Jeltsin Neckebroek, Head of DTC eCommerce at AB InBev, set the stage by highlighting the transformative power of real-time data. In an era where consumer preferences can shift overnight, the ability to adapt in real time is not just a competitive advantage; it’s a business imperative. Neckebroek shared how AB InBev uses real-time analytics to optimize its supply chain, adjust pricing dynamically, and predict consumer behaviour. This level of agility is particularly crucial in regulated industries, where delays can result in lost revenue and compliance issues. Tarun Rana, Head of Data and Analytics at Henkel, took the conversation a step further by discussing the challenges posed by legacy systems. Many organizations are hamstrung by outdated infrastructures that are not equipped to handle real-time data processing. Rana emphasized the need for a robust data architecture to support the high velocity, volume, and variety of real-time data. He introduced the audience to the concept of “DataOps,” a set of practices and tools that enable agile data operations, echoing the principles of DevOps in the software development world. However, the panellists unanimously believed that technology alone is not the answer. Natasha Govender-Ropert, ING’s Senior Global Data Science Manager, spoke eloquently about the human element. She argued that real-time intelligence is not just about having the right technology; it’s also about having the right people and processes in place. This involves training staff to think in real time and to make decisions quickly and confidently based on the data.
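
None of the panellists’ stacks were shown, but the kind of always-fresh signal real-time intelligence rests on, as opposed to a nightly batch report, can be illustrated with a simple sliding-window aggregate updated as each event arrives. The window size and event values below are arbitrary.

```python
# Minimal sliding-window aggregation: a rolling average updated per event,
# the kind of always-fresh metric a real-time pricing or demand signal needs.
from collections import deque

class SlidingWindowAverage:
    def __init__(self, window_size: int = 100):
        self.window = deque(maxlen=window_size)  # oldest events drop off automatically

    def update(self, value: float) -> float:
        """Add one event and return the current rolling average."""
        self.window.append(value)
        return sum(self.window) / len(self.window)

demand = SlidingWindowAverage(window_size=5)
for orders_per_minute in [12, 15, 11, 30, 28, 27]:
    print(round(demand.update(orders_per_minute), 1))
```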

Timea Töltszéki, Head of Data and Platforms at Boehringer Ingelheim, provided a compelling example from the healthcare sector. She discussed how real-time analytics are revolutionising patient care, enabling doctors to make life-saving decisions based on real-time data. This is a vivid illustration of the panel’s overarching message: that real-time intelligence has the potential not just to transform businesses but also to improve lives.

The panel’s insights were a wake-up call for organizations to embrace the ‘now.’ In an increasingly volatile, uncertain, complex, and ambiguous world, the ability to respond in real time is not just a nice-to-have; it’s a must-have. Those who master this capability will be well-positioned to thrive in the fast-paced world of tomorrow.

Rowan van Dongen, a Consultant in Manufacturing Analytics at AVEVA Select Benelux, took the stage to discuss a topic that has been the subject of much debate but little action: leveraging cloud technology for sustainable optimization in manufacturing. The cloud has long been viewed as a tool for data storage and computational power, but Rowan presented it as far more than that. He argued that the cloud is not just a technological tool but a strategic enabler, especially in sectors like pharmaceuticals, where the stakes are high and the regulatory environment is stringent. His discussion was not a mere overview; it was a deep dive into the economics of responsible business. He cited real-world examples from the manufacturing sector where cloud-enabled analytics had led to a 20% reduction in energy costs. This is not just about corporate social responsibility; it is about the bottom line. In industries where every percentage point of operational efficiency can translate into significant financial gains, his message was both timely and timeless: the cloud can be the linchpin of a strategy aimed at sustainable growth.

Rowan urged industries to re-evaluate their digital strategies, not as isolated IT initiatives but as core components of their business models, arguing that the cloud could unlock new levels of efficiency and sustainability, especially in regulated industries where compliance and performance are often seen as opposing forces. For companies at the intersection of technology and regulation, he showed how cloud technology can be used to meet and even exceed regulatory requirements, turning a potential obstacle into a competitive advantage. That matters most in sectors like pharmaceuticals and healthcare, where compliance is non-negotiable and the cost of failure is astronomical. In a world where sustainability is no longer a buzzword but a business imperative, his presentation offered a roadmap for navigating the complexities of modern manufacturing: treat cloud technology not as an IT expense but as a strategic investment that pays dividends in operational efficiency, corporate reputation, and customer trust. It was a call to action for all stakeholders, from C-suite executives to frontline workers, to rethink their approach to technology, sustainability, and the intricate web of regulations that governs modern industries, and a blueprint for a future in which technology and sustainability are seamlessly aligned.

Timea Töltszéki, Head of Data and Platforms at Boehringer Ingelheim, took the stage to address a topic that has been on the minds of many but articulated by few: data monetisation. In an era where data is often called the “new oil,” Timea’s insights offered a nuanced perspective beyond the usual platitudes. She delved into the complexities of understanding the actual value of data before even attempting to put a price tag on it, a critical consideration for companies in regulated industries, where data not only has intrinsic value but also carries significant compliance and ethical implications. Her presentation wove together the technical, strategic, and ethical dimensions of data monetisation. She cited a recent project at Boehringer Ingelheim in which data monetisation strategies led to a 15% increase in operational efficiency, but was quick to point out that this was not just about financial gains: the goal is a holistic business strategy in which data is an integral part of the value chain. This is particularly relevant in the pharmaceutical sector, where data can catalyse innovation, driving advancements in drug discovery, patient care, and even regulatory compliance.

The concept of KPIs, often relegated to performance metrics, took on a new significance in Timea’s discussion. She argued that KPIs can act as a compass, guiding companies in their data monetisation efforts. By aligning KPIs with broader business objectives, companies can create a roadmap for data monetisation that is financially lucrative, ethically sound, and strategically aligned with long-term goals. Timea also touched upon the pillars of discoverability and data quality, emphasizing that these are not mere technical considerations but strategic imperatives. In industries like healthcare and finance, where data accuracy is not just a quality metric but a legal requirement, the importance of data quality cannot be overstated. Her insights served as a reminder that in the quest for data monetisation, quality and ethics are not checkboxes to be ticked but core to the fabric of a sustainable data strategy. More than an academic exercise, her presentation offered a strategic framework for companies grappling with the complexities of data monetisation in a rapidly evolving landscape, and a call for business leaders to view data not just as a commodity to be sold but as a strategic asset to be leveraged.

Product Barcamp: A Confluence of Strategy, Technology, and Networking

The audience, a mix of tech enthusiasts, business leaders, and strategists, leaned in, ready to delve into a subject that promises to redefine the very fabric of business operations.

My presentation began by setting the stage for the transformative potential of Large Language Models. These are not mere tools for automating routine tasks; they are, in essence, a new form of intelligence that can augment human capabilities in unprecedented ways. Take customer service, for example: in industries such as healthcare and finance, which are bound by stringent regulations, the margin for error is minimal. Large Language Models can sift through the rules and guidelines to provide accurate and compliant responses, freeing human agents to tackle issues requiring emotional intelligence and nuanced understanding.
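
As a purely illustrative sketch of that customer-service pattern, the relevant guidelines can be injected into the prompt and the draft reply checked before it ever reaches a customer, with escalation to a human agent as the fallback. As before, `complete` is a hypothetical stand-in for a real LLM client, and the rules and checks below are invented.

```python
# Hypothetical sketch of constraining an LLM reply with regulatory guidelines:
# rules are placed in the prompt, and the draft is checked before it is sent.

GUIDELINES = {
    "finance": [
        "Do not give personalised investment advice.",
        "Always note that past performance does not guarantee future results.",
    ],
}

def compliant_reply(question: str, domain: str, complete) -> str:
    rules = "\n".join(f"- {r}" for r in GUIDELINES.get(domain, []))
    prompt = (
        "Answer the customer's question. You must follow these rules:\n"
        f"{rules}\nQuestion: {question}\nAnswer:"
    )
    draft = complete(prompt)
    # Simple guardrail: escalate to a human agent if the draft looks like advice.
    if domain == "finance" and "you should invest" in draft.lower():
        return "[escalated to human agent]"
    return draft
```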

However, the true disruptive power of these technologies lies in their ability to generate novel insights from existing data pools. I shared an example from a project in the pharmaceutical sector where a Large Language Model was deployed to analyze a vast corpus of medical literature. The model identified potential drug interactions that had eluded human researchers, opening new avenues for drug development and patient care. This is not mere automation but innovation, which can shift paradigms and create tangible value for stakeholders.

The audience was visibly engaged, nodding in agreement and jotting down notes. The questions that followed were insightful, probing the ethical considerations of AI and the challenges of integrating these technologies into existing IT infrastructures. It was evident that the implications of Large Language Models and Generative AI resonated with the audience, not as abstract concepts but as real-world solutions with the potential to drive both internal and external transformation.

Garg began by dissecting the common misconceptions that plague product strategy. He argued that a successful strategy is not just about having a great product; it’s about understanding the ecosystem in which that product exists. This involves a deep dive into customer behaviour, market trends, and geopolitical factors that could influence product adoption. He cited the example of a European tech company that failed to gain traction in Asian markets, not because of product inferiority but due to a lack of understanding of local consumer behaviour and the regulatory landscape. This led to a discussion on the role of data analytics in shaping product strategy. Garg emphasized that data is not just a byproduct of business operations but a critical asset that can drive strategic decisions. He shared insights into how advanced analytics tools can help organizations move beyond surface-level metrics to uncover deeper patterns and trends. For instance, using machine learning algorithms to analyze customer reviews and social media mentions can provide invaluable insights into customer pain points and preferences, which can be leveraged to refine product features or develop new offerings.

The conversation then shifted to the panel discussion, where I had the opportunity to engage with Garg and the audience. The questions were incisive, reflecting the depth of the topic and its broad-ranging implications. One question that stood out was about the role of organizational culture in shaping product strategy. Garg and I agreed that culture is not a peripheral concern but a core component of strategy. A culture that fosters innovation, values data, and is agile enough to adapt to market changes is more likely to succeed in crafting a winning product strategy.
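
Returning to Garg’s point about mining reviews and social-media mentions: a toy scikit-learn sketch of that analysis might classify sentiment with a bag-of-words model and then read off the most negative-leaning terms as candidate pain points. The reviews and labels below are invented.

```python
# Tiny review-mining sketch: classify review sentiment, then inspect which
# terms drive negative predictions as a rough signal of customer pain points.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

reviews = ["app keeps crashing on checkout", "love the fast delivery",
           "support never answered my ticket", "great prices and easy returns"]
labels = [0, 1, 0, 1]  # 0 = negative, 1 = positive (toy labels)

vec = TfidfVectorizer()
X = vec.fit_transform(reviews)
clf = LogisticRegression().fit(X, labels)

# Terms with the most negative weights are candidate pain points to investigate.
terms = vec.get_feature_names_out()
weights = clf.coef_[0]
pain_points = sorted(zip(weights, terms))[:5]
print(pain_points)
```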

The Intersection of Innovation, Regulation, and Scalability

After chairing and participating in these transformative summits, I find myself revisiting the question that initially guided my preparation: What are the real challenges and opportunities that AI, machine learning, and large language models present to organizations today? The insights gleaned from the summit have only deepened my understanding of this complex landscape, and I believe we are at a critical juncture that demands thoughtful navigation.

The rapid pace of technological advancement creates a form of “technological debt” for organizations, particularly those in regulated industries. This debt manifests as a widening gap between what technology can offer and what organizations are capable of implementing. The summit’s discussions on data lineage, operationalization, and real-time intelligence have underscored the urgency of this issue. Organizations must understand and control the value chain that impacts customer behaviours. Data, when adequately deciphered, can provide invaluable insights into this chain. However, the full potential of data as a currency remains largely untapped. The advent of AI and machine learning technologies offers a way to navigate this complex data landscape more efficiently. Yet, as the summit’s panel discussions highlighted, the real breakthrough will come when organizations fully grasp the concept of data as both stock and flux. This will lead to transformative moments that could redefine industries, like how Tesla’s self-driving capabilities have disrupted the automotive sector.

The workforce landscape is also undergoing a significant shift, a point that was palpably evident during the summit. On one end of the spectrum, we have emerging leaders focused on forward thinking and strategic planning. On the other, a burgeoning gig economy is populated by freelancers constantly innovating and adopting new skills. This dichotomy presents a unique challenge for regulated industries, where a culture that tolerates failure is not yet the norm. Traditional leadership roles, often characterized by risk-averse behaviour, are increasingly at odds with the fast-paced, innovative culture of the startup ecosystem. The disconnect is further exacerbated by the lack of a unified approach to innovation, as evidenced by the diverse range of tools and methodologies employed across organizations. This lack of standardization hampers scalability and makes it challenging to build strong governance structures.

The summit also touched upon the evolving role of brands in this high-velocity landscape. The push for greater visibility has led some brands to prioritize quantity over quality, resulting in diluted messaging. This trend was evident in the discussions on data monetization and manufacturing analytics, where the focus has shifted from crafting a compelling narrative to simply being seen. However, as the summit’s presentations on AI and large language models highlighted, disruptive innovation is becoming the new norm. Brands that wish to maintain their positioning for the long term are adopting a more measured approach, leveraging new technologies to create more targeted and impactful campaigns.

These two summits have reinforced my belief that we are at a critical crossroads. The challenges are manifold, but so are the opportunities. As someone who operates at the intersection of innovation, regulated industries, and the startup ecosystem, I see a pressing need for a new breed of leaders: leaders capable of bridging strategic imperatives with the operational and tactical activities that realize them. They must be tech-savvy and adaptable, with a deep understanding of data as a currency and technology as an enabler. Only then can organizations hope to navigate this complex landscape successfully and seize the opportunities that lie ahead.
