Data is critical for an organization to make informed decisions. Both data consolidation and data integration allow organizations to analyze all aspects of their business in order to make improvements.

Data integration combines data, systems, and platforms to create a unified perspective of all data. The goal of data integration is to harmonize important data sets and to establish a repeatable process for merging and unifying data. This gives organizations a seamless way to access data and share it across the business to improve their insights. In contrast, data consolidation brings data from multiple sources, locations, and systems into one cohesive framework, reducing redundancy and increasing efficiency. The purpose of data consolidation is to reduce data duplication and to make data easier to access and manage.

Both data integration and consolidation involve combining data from multiple sources. The main difference is that data integration focuses on creating a unified view of data drawn from different systems, while data consolidation centralizes data in a single location and reduces redundancy. The challenges and solutions related to both approaches are outlined below.

Data Integration Challenges and Solutions

Data integration benefits businesses because it gives them a bigger picture of what is happening within the organization. Many key issues that companies face can be analyzed using integrated data. For example, if your company sends a survey to its customers but only half of them complete it, integrated data can help pinpoint why this is occurring. Resolving data integration challenges matters because it allows organizations to improve the overall customer experience and journey while using the product or service. By combining data from multiple sources, businesses capture a real-time view of how specific strategies are performing.

Using Manual Data Integration

Traditional manual methods, such as spreadsheets and pivot tables, can be useful for smaller businesses but introduce various issues: they do not scale as the organization grows, they create confusion when sharing data across departments, and they consume a large amount of time. The solution is an automated data integration tool that collects data in real time and processes it when you need it.

Lack of Scalability

As a business grows, it can be difficult for data integration tools to keep up in terms of scalability. This is a problem for large businesses that want to handle large influxes of data efficiently. It can be solved by choosing a data integration tool that can adapt to changes within the business. Preparing for business expansion ahead of time can also help: if you know your company will acquire another business, identify the strongest data points from that business and plan their integration with your own ahead of time.

Low-Quality Data

If the quality of data is low, integration tools will not be able to analyze it or put it to use when evaluating business strategies. The solution is data quality management. Data management tools check that data is free of errors: they help businesses understand the state of their data, validate it before it is loaded, and ensure that the organization only works with data that is usable and error-free.
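To make the idea concrete, here is a minimal sketch of an automated quality gate in Python with pandas; the file name and column names are hypothetical, and a real data quality management tool would run far richer checks:

```python
import pandas as pd

# Hypothetical customer export; the column names are illustrative.
df = pd.read_csv("customers.csv")

# Count basic defects before the data is loaded anywhere downstream.
checks = {
    "missing_email": int(df["email"].isna().sum()),
    "malformed_email": int((~df["email"].str.contains("@", na=False)).sum()),
    "negative_age": int((df["age"] < 0).sum()),
}

failed = {name: count for name, count in checks.items() if count > 0}
if failed:
    # Block the load so only usable, error-free records move on.
    raise ValueError(f"Quality checks failed: {failed}")
print("All quality checks passed; data is safe to load.")
```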
Lack of Planning

Before beginning data integration or adopting integration tools, it is important for businesses to assess their current goals. Organizations need to understand what exactly they are integrating and how this data will be useful. Many businesses underestimate the importance of data integration or do not know which tools will help them reach their specific goals. This challenge can be addressed by researching tools before integrating any of them and determining which are most beneficial to your industry.

Duplicated Data

Duplicated data is an issue that affects 94% of businesses and can cost an organization money and time: repetitive marketing campaigns, for instance, or contacting a customer who has already been contacted. Duplication undermines the core purpose of data integration, which is a single customer view that improves the overall buyer experience. The solution is de-duplication tools and features that can merge records, perform quality checks, and recognize when information is either missing or duplicated.
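As an illustration of what such tooling does under the hood, here is a minimal de-duplication sketch in Python with pandas; the file and column names are hypothetical, and commercial tools add fuzzy matching and record merging on top of this:

```python
import pandas as pd

# Hypothetical contact lists from two systems; column names are illustrative.
crm = pd.read_csv("crm_contacts.csv")
marketing = pd.read_csv("marketing_contacts.csv")

contacts = pd.concat([crm, marketing], ignore_index=True)

# Normalize the matching key first, or near-identical records slip through.
contacts["email"] = contacts["email"].str.strip().str.lower()

# Keep one record per customer so no one is contacted twice.
deduped = contacts.drop_duplicates(subset="email", keep="first")
print(f"Removed {len(contacts) - len(deduped)} duplicate records")
```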
Data Consolidation Challenges and Solutions

Data consolidation is important because it ensures that an organization uses high-quality data to make informed decisions. The practice is rooted in transforming data into a consistent format, giving a business the opportunity to analyze the data and improve its quality before it is used. It also ensures that companies have an accurate data set from every source. Both data consolidation and integration are useful for any organization because they provide insight into multiple aspects of the business. Consolidation in particular allows companies to turn disparate data points into insights that help the organization improve.

Limited Resources

Data consolidation can be lengthy and require an entire team to run the process. It can be difficult for organizations to dedicate that much time or to work out how best to use their resources. The best solution is an extensive plan drawn up before consolidation begins: decide which individuals within a department or team are assigned to the task, and determine resource constraints and strengths.

Data Security

Data security can be a concern because consolidation centralizes data, meaning all the data is in one place. Though centralization brings many benefits in terms of organization and accessibility, it can present a security risk. Keeping up with the latest security measures and implementing firewalls can help minimize these risks. Consulting with data security experts to address potential threats before consolidation begins can also prevent problems in the future.

Data Latency

When using a central source, data latency can become a problem. An organization may not be viewing the most accurate representation of its data, because it takes time to retrieve data from the original source and transfer it to the central target. The best remedy is frequent data transfers, run on a clear schedule communicated to a designated team.

Data Incompatibility

Data generated from multiple sources is often formatted inconsistently and can cause problems for ETL tools. Since all of this data is consolidated in a single location, it must be transformed together before it can be used in analytics. The solution is transforming the data by hand-coding it so that all data sets share consistent formatting.
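Here is a minimal sketch of that kind of hand-coded transformation in Python with pandas; the source files, column names, and format rules are hypothetical stand-ins for whatever your sources actually produce:

```python
import pandas as pd

# Hypothetical exports from two regions with inconsistent formatting.
us_orders = pd.read_csv("orders_us.csv")
eu_orders = pd.read_csv("orders_eu.csv")
orders = pd.concat([us_orders, eu_orders], ignore_index=True)

# Coerce date strings to one datetime dtype; unparseable values become NaT.
orders["order_date"] = pd.to_datetime(orders["order_date"], errors="coerce")

# Standardize free-text country labels to one vocabulary before loading.
country_map = {"usa": "US", "united states": "US", "deutschland": "DE"}
orders["country"] = (
    orders["country"].str.strip().str.lower().replace(country_map)
)

# The consolidated frame now has consistent formats for ETL and analytics.
orders.to_csv("orders_consolidated.csv", index=False)
```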
Lytics, Wpromote, Google Cloud: 2Gather Los Angeles, June 6, 2023

Buzz Hays, Global Lead, Entertainment Industry Solutions, and Iman Ghanizada (@iman), Global Head of Autonomic Security at Google Cloud, opened the event by discussing how the purpose of AI is to improve what people are already doing: whether they are writers or animators in a given industry, AI aims to enhance the artist's paintbrush. In the effort to provide businesses with better tools, many questions surrounding security and data arose. A major one was how to collect effective data for AI projects. A primary example discussed during the event was the entertainment industry, where many applications of AI need a sufficient amount of customer data to be developed; identifying ad breaks and suggesting content on streaming platforms are two such use cases. Jascha Kaykas-Wolff from Lytics stated that mature organizations can adapt to data pipelines, and that working across different departments makes the decision-making process much easier, because it demonstrates how data is useful to certain parts of the organization. Paul Dumois, the CTO of Wpromote, added that businesses need to focus on specific problems to solve and retrieve data that will help provide solutions to those issues. Overall, the discussions between the panel and the audience highlighted that AI has many moving parts and trends. An organization should focus on a specific area and start with a single project to learn about the challenges and dynamics of working with AI in real time. Additionally, analyzing the core metrics of the business and securing top-down support can help when allocating resources to projects or tasks associated with AI.
C2C's first event for developers took place on April 26, 2023, in Sunnyvale, CA. The event focused on data analytics and how organizations can optimize their data. Below are some data buzzwords and their definitions, an overview of Dataplex, a product that was demonstrated at the event, and a summary of the key topics discussed.

Data warehouse: A system used for reporting and data analysis. A data warehouse is a large store of data accumulated from a range of sources that helps businesses with decision-making processes.

Data lake: A centralized infrastructure designed to store and process large amounts of data. A data lake can store data in its original form and process it in any format. Data lakes are scalable platforms that allow organizations to ingest data from any source at multiple speeds.

Data lakehouse: A modern data platform that combines a data warehouse and a data lake.

BigQuery: A serverless architecture built as a data warehouse that works across clouds while scaling with your data. BigQuery allows users to pick the right feature set for workload demands and can match these needs in real time. It can also analyze data across multiple clouds and securely exchange data sets internally or across businesses, making it a platform for scalable analytics (a minimal query example appears at the end of this piece).

BigLake: A storage engine that unifies data warehouses and lakes, accessed through BigQuery.

Dataplex

Dataplex is a lake administration and data governance tool. It enables organizations to discover, manage, and evaluate their data across data lakes and data warehouses. Dataplex also has a variety of features that let organizations manage specific data sets with ease. For example, the tag management feature ensures that specific users have access to the right data by attaching policy templates and tags to different sets of data. Dataplex also automates data quality management: if a report quotes incorrect numbers, the data can be corrected with automated tools rather than manually.

Data and Real-Time Analytics

A major point raised at the developers' event was that data is rooted in event-driven architecture. For instance, customers who work in finance become highly interested in real-time data during specific periods; this interest is event-based, as it usually peaks when the industry reaches a quarter close. Moving data around can be a difficult task; however, certain cloud features, such as Dataplex, can solve this issue. The main concern surrounding organizing data is access control and governance: customers want to know that steps have been taken to ensure that unauthorized users do not gain access to private data. Visibility and transparency are also core tenets when discussing access to data and its governance tools.
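To ground the BigQuery definition above, here is a minimal query sketch using the google-cloud-bigquery Python client against one of Google's public datasets; it assumes application-default credentials are configured, and the query itself is only an illustration:

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# Aggregate a public dataset; BigQuery handles scanning and scaling.
query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

for row in client.query(query).result():  # waits for the job to finish
    print(f"{row.name}: {row.total}")
```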
This week, Bruno shares lessons in entrepreneurship from Netflix, insights from the latest Gartner research, and brings a special guest to talk about the Data Mesh.

This CarCast covers:

Why the Netflix story is an example of the culture and success of entrepreneurship, and its lessons in thinking differently and persevering.

Gartner's latest research, which shows that budgets are up and that the latest Gartner data trends point to three themes: "from platforms to ecosystems," "don't forget the humans," and "think like a business."

Finally, if you'd like to connect with Bruno live, you can meet him this Wednesday at the Commonwealth Club at the Everyday AI event. For more, check out Bruno's blog here.

Have a great week!
This week, Bruno shares best practices on leadership, key metrics on scaling startups, and a few examples of organizations that have succeeded in data!

This CarCast covers:

Good leadership vs. bad leadership: Effective leaders map their decisions to principles. Effective leaders coach. Effective leaders shine a light on fears and worries.

What do startup CEOs do, and how long does it take to go IPO? Did you know that the median equity raised through Series F is about $569M, at a median post-money valuation of $2.9B? Or that it takes about 9 years to get to Series F, with a new round every 15 to 19 months?

Stories of great leaders: From PayPal to Squarespace, leaders share their best practices in data here.

Finally, if you'd like to connect with Bruno live, you can meet him at the Everyday AI event on April 19, 2023 at the Commonwealth Club, San Francisco!

Have a great week! For more, check out Bruno's blog here.

Best,
Bruno
This week, Bruno breaks down data quality, shares the latest in tech spending, and highlights the success of key data leaders you should know and follow.

This CarCast covers:

Data quality attributes: Only about a third of data and analytics organizations get value from their data. Why? Data quality. Bruno gives you a set of questions you can ask to assess the quality of your data: questions about the data itself, the people, and the actions taken on the data. The best acronym he came up with was DPA for DQ. Do you have a better one?

Cloud software spending, March 2023: three key insights you can't afford to miss (the budget split between people and tech, priority hiring, and priority investments).

Stories of great leaders: From Orange France to Richemont, Cartier, and Geotab, learn the stories of data leaders in your field.

Finally, this coming week you'll have two ways to connect with Bruno: in person at the Data Cloud Live Summit in Toronto, and online, where he'll discuss Harvard Business Review's latest stats on data, AI, and analytics.

Have a great week! For more, check out Bruno's blog here.
This week, Bruno demystifies the definition of innovation. What is it, and how should you evaluate it? He quotes the great Clayton Christensen and points to his book, The Innovator's Dilemma.

This CarCast covers:

Highlights of the Gartner Data & Analytics Summit 2023: did you know that only 34% of D&A organizations are consistently able to produce clear business value?

The top 10 data, AI, and analytics trends: from the modern data stack under pressure to the new political economy of AI.

The "Lift and Shift Shot Clock": the longer you hold onto legacy practices in the new game of cloud computing, the less likely you are to win.

For more resources, links, and photos, check out Bruno's blog here.
This week, Bruno breaks down what matters and what doesn't in the world of data (he's brought a special guest to help him out, too!). This week Bruno was also featured in Harvard Business Review, and he points to some of the latest research from that piece: for instance, did you know that 81% of organizations have increased their data and analytics investment over the past two years?

This CarCast covers:

What to pay attention to and what to ignore: Gartner's 2023 Emerging Tech and Trends Impact Radar is out! The graphic and the research behind it contain a lot of trends. One tip: look for the biggest bubble closest to the center.

What to expect at the 2023 Gartner Data & Analytics Summit: data fabric, data products, and data engineering are some of the trends to watch this week. Read Philip Russom's post to find out more here.

Data leaders you can't afford not to know or follow: Bruno points to the incredible journeys of data leaders at Carrefour, L'Oreal, Yves Saint Laurent, Groupe Rocher, Kering, Swarovski, and Servier, and ways to connect with them here, here, and here!

And finally, if you are planning on attending the Gartner Data & Analytics Summit at the end of the month, don't hesitate to connect with Bruno and let him know if you'd like to meet in person, at sessions, or at the various data, AI, and analytics gatherings throughout the week!

Analytically yours,
Bruno
This week, Bruno breaks down how you can tell when your company is ready to scale and points you to leaders you should know and follow so you too can grow!

This CarCast covers:

Leaders you're going to want to know and follow: Swarovski's VP of Data Fabrizio Antonelli and Cartier's Chief Data Officer Thomas Meyer are two data leaders who are making machine learning approachable and impactful for customers and employees. Follow them here and here!

How to know when your company is ready to scale: Bruno breaks down Stage 2 Capital's Jay Po on LTV and CAC. You can read more about it here.

Playbook for growth: the five questions you need to answer about your business. In this post, Bruno draws on the Business Model Generation methodology to provide the five questions you need to ask about your business as you attempt to scale it!

And finally, if you are planning on attending the Gartner Data & Analytics Summit at the end of the month, don't hesitate to connect with Bruno and let him know if you'd like to meet in person, at sessions, or at the various data, AI, and analytics gatherings throughout the week!

Analytically yours,
Bruno
This week, Bruno talks about the three key attributes of modern data products, covers best practices in data, and points you to leaders you should know and follow so you too can grow!

This CarCast covers:

The good, the bad, and the ugly of data: The data organization is now a value organization (70% of data leaders report to the company's president, CEO, COO, or CIO), which gives you, the data leader, the opportunity to align on business objectives, not just technical ones. Read more on VentureBeat here.

How to succeed as a data leader: Bruno reviews the do's and don'ts of Jaguar Land Rover's former data chief. He talks about accountability and thoughtful planning. Data is more than just tech!

Data products in action: For Bruno, data products are about data, time, and people. Listen in to get the breakdown!

And finally, you all liked Bruno's MAD interview so much that he's created a playlist! Check out the snippets and behind-the-scenes short videos here. (By the way, MAD stands for the Machine Learning, Artificial Intelligence & Data landscape!)
An engaged audience eagerly listens as Sanjay Chaudhary, Vice President of Product Management at Exabeam, explains how hackers are able to use MFA bombing to break into employee emails and gain confidential company information. This was one of many topics surrounding data optimization discussed at the 2Gather event in Sunnyvale, California on February 3rd.

"Not coming from a technical background, I wasn't sure what to expect at my first event. However, the panel's rich and engaging narrative made data security into an amazing story to listen to!" said June Lee, Senior Program Manager at Workspot.

The first C2C event of the year embodied the essence of forming meaningful connections. At the beginning of the event, all attendees were asked to introduce themselves to two other individuals they had not spoken to, creating a strong sense of openness and pushing people beyond their comfort zones to spark personable interactions. Through peer-to-peer conversation, guests connected on driving advocacy and feedback around using Google Cloud for data analytics. The event featured a diverse panel of Google partners including NetApp, Exabeam, Lytics, and Cisco Systems.

"Everything starts with a customer," stated Bruno Aziza (@BrunoAziza), the Head of Data and Analytics at Google. This approach is the driving force behind Google building close relationships with its customers and understanding their journeys and the challenges that can arise, one of which is getting value from data that has been collected. "A large amount of organizations are struggling to turn data into value and money is being spent on data systems, yet companies are not always benefiting from it," says Bruno. Organizations now have access to large sets of data; however, critical pieces of data are not typically within their internal environment. A step in the right direction is to create data products that assist with tackling this issue.

One of the major keynote speakers, Vishnudas Cheruvally, Cloud Solution Architect at NetApp, provided insight on solutions the organization is working on. "One of the main goals of NetApp is to build an environment that is rooted in trust and to create an infrastructure where users do not have to worry about basic tasks associated with optimizing data," says Vishnudas. Through billing APIs and resizing data volumes with Google Cloud services, customers have accessible tools that allow them to make informed decisions, including customized dashboards to observe what is happening within their environment. Along with data optimization, emerging global trends and their impact on data sovereignty were a recurring topic that captivated the audience. "Data sovereignty and upcoming global trends within data security were key topics discussed at the event and are also motivating factors behind solutions developed by NetApp," stated Vishnudas.

"An emerging trend is using excessive resources through multiple clouds and essentially creating a wasteland," says Jascha Kaykas-Wolff (@kaykas), President of Lytics. This observation sparked the topics of global trends, data sovereignty, and cloud strategy. With high amounts of data being stored by organizations, questions begin to arise about ownership. "Data has to live in a specific area and there has to be control or sovereignty over it," says Jascha. The panel engaged in a conversation that covered shifting global trends and how they impact customers.
Sanjay Chaudhary brings a product management perspective, which is rooted in solving customer problems. "With more regulations being created, data cataloging is essential for customers to understand what is critical in terms of their data and security threats. The core principle of data is the same; the most important thing is being able to detect a problem with the data and how fast it can be addressed," says Sanjay.

From ownership to data security, the discussion highlighted a variety of fresh perspectives. What stood out to guests was the diversity of the panel, which brought in differing views. "The event had extremely thought-provoking insights stemming from the issues of modern-day data analytics and how they impact a customer base, as well as a panel that discussed their personal experiences with data," said Dylan Steeg (@Dylan_Steeg), VP of Business Development at Aible.

Speakers and guests then attended a networking session following the event. Over refreshments and drinks, guests were able to mingle with one another and further expand the conversation. Most importantly, they were able to create meaningful connections: connections that may lead to future collaborative efforts and to solutions that can take data optimization to new heights.

You and your organization can also build these connections. To start, join C2C as a member today. We'll see you at our next 2Gather event!

Extra Credit:
Whether you are a data scientist or analyst, understanding BigQuery architecture provides insight on how organizations control costs and analyze data with built-in features. If you want to optimize your data sets through the scalable capabilities of BigQuery, listen to the stories of these four growing startups: Tinyclues, Aible, Connected-Stories, and Snorkel AI.

The speakers discussed the following at this 2Learn event:

How companies are able to scan a high volume of topics among users while also improving the user experience

Understanding how BigQuery allows organizations to leverage data to develop strategies and optimize campaign performance

Accessing knowledge on data-centric AI and its integration within a workflow

Analyzing specific data sets that will provide the most valuable insight into market conditions and consumer behavior

Watch a full recording of this event below:

Extra credit:
@Yuliia Tkachova, the founder and CEO of Masthead Data, discovered a data issue with the power to disrupt companies of all sizes. Find out why she decided to build a business to address the issue, and the tips and best practices she has learned as a founder along the way.

In this event, she and @hannahparker discuss the most common obstacles and challenges to data quality across businesses of all sizes; what makes data reliable and the importance of having reliable data at scale; how Yuliia navigated what she knew and what she didn't know, and the challenges she encountered on her journey; and tools for success, including why Masthead Data relies on BigQuery and the Google Cloud Platform.

Watch the full recording here:
On June 14, C2C hosted an event in Google's Cambridge office. We believe in-person connections are invaluable to everyone in our community, especially when our members are able to immediately converse with amazing speakers who are sharing their journeys and business outcomes. The stories from this event—presented on stage by Google Cloud customers, partners, and employees—can all be reviewed below.

Introduction from Google

Yee-chen Tjie (@yeetjie), Google Cloud Life Sciences Head of Customer Engineering, kicked off the program at C2C Connect Live: Cambridge with a few words about how Google is using 10x thinking to make major, unique, and substantial investments in healthcare and life sciences technology. Tjie made a point of mentioning Google's record of solving problems using AI and ML, particularly with AlphaFold 2, the focus of the presentation Luke Ge of Intel gave later in the afternoon. After his opening remarks, Tjie hosted a round of Google trivia, inviting everyone in the audience to stand and then sit down every time they answered one of his true-or-false questions incorrectly. After guessing whether Google Suite was initially offered on CD in 2006 (false), the first Google Doodle was about Coachella because the founders were going (false––they were going to Burning Man), and the English translation of Kubernetes is "cargo ship" (false––it's "pilot"), Tjie handed the lucky winner a free Google hub device.

CISO Healthcare and Life Sciences Reflections

Before beginning his presentation, Taylor Lehmann (@taylorlehmann1), Director of the Office of the CISO at Google Cloud, thanked the hosts for the opportunity to join and speak, noting that he had just had his "mind blown" talking to fellow presenter Jonathan Sheffi before the event. Lehmann went on to discuss some of the core principles of invisible security, and his office's mission to "get to this vision where security is unavoidable." A big part of this project, he explained, is eliminating the shared responsibility model in favor of what Google calls "shared fate." Under this model, Google provides blueprints, solutions, and curated patterns to enable customers to manage their own security infrastructures. "If you have a bad day on Google Cloud, it's a bad day for us too," he summarized. "If you win on Google Cloud, you win too."

The History and Future of Human Genomics

Jonathan Sheffi (@sheffi), formerly a Director of Product Strategy at Veeva Systems and Google Cloud, began his presentation by prodding the audience with an enthusiastic "How's everyone doing?" and then added, "First rule of public speaking: make sure the audience is awake." The focus of Sheffi's presentation, the history and future of human genomics, took the audience back to the year 1990, when, in Sheffi's words, "Nirvana's Nevermind is a year from coming out, it's a very exciting time." Sheffi went on to cover the advents of next-gen sequencing and of public cloud computing, government and pharmaceutical adoption of genomic sequencing, and recent cost-cutting advancements in genomics. When he opened things up to the audience for questions, Michael Preston of Healthcare Triangle shared his own experience seeking treatment for melanoma to ask how genomic sequencing can be used to predict patient reactions to prescribed medications. Sheffi took the question to heart and acknowledged the need for sequencing and screening processes that take into account data on patient-reported side effects.
End-to-End Optimization of AlphaFold 2 on Intel Architecture

Luke Ge (@Liangwei77ge), an AI Solution Specialist at Intel, opened his presentation by saying, "Yesterday I spent 6 hours on a plane to come to this beautiful city," prompting a round of applause from the audience. Then he asked, "How many of you are using AlphaFold 2?" A few hands went up. He followed up with, "How many of you have heard of AlphaFold 2?" Many more hands raised. Ge's presentation explored how analyzing human tissue from DNA to protein structure requires using AI to process huge amounts of sequence data. The Google product that handles this processing is AlphaFold 2. Ge explained how Intel's computing hardware supports AlphaFold 2, including by providing deep learning model inference and removing memory bottlenecks in AlphaFold 2's attention and Evoformer modules. At the end of his presentation, Ge demonstrated a model generated using non-optimized versus optimized AlphaFold 2 code. The difference was clear.

Panel Discussion

Tjie moderated the panel discussion with Sheffi and Ge, opening by asking each whether he is a Celtics fan or a Warriors fan. Immediately, the tension in the room rose: Sheffi and Ge are from opposite coasts, making Sheffi a Celtics fan and Ge a Warriors fan. The tension was short-lived, however. When Tjie asked Ge what he considers the best way to choose a compute instance, Sheffi followed up to ask Ge if it's possible to run multiple sequences on a single instance and maintain performance. Ge said yes. When Tjie opened questions to the audience, several guests rose to ask Sheffi questions about genomic sequencing, more than one of them focusing on use cases for genomic research for patients and caregivers. After several of these questions in a row, Tjie turned to the crowd and said, "I warned Luke that if he picked the Warriors then he would get less questions from the audience." After the laughs in the room died down, Tjie asked Ge where he sees HCLS problems being solved with AI. Ge did not have to think long before citing computer vision as a solution for detecting cancerous cells.

Winding Down

Following the presentations, all in attendance broke away to connect during a networking reception. To read more about it, check out the exclusive onsite report linked below in the Extra Credit section.

Extra Credit
On May 12, C2C hosted its first east coast event at Google's New York office. We believe in-person connections are invaluable to everyone in our community, especially when our members are able to immediately converse with amazing speakers who are sharing their journeys and business outcomes. The stories from this event—presented on stage by Google Cloud customers, partners, and employees—can all be reviewed below.

A Warm Welcome from C2C and Google Cloud

Opening the event was Marco ten Vaanholt (@artmarco), who leads C2C initiatives at Google Cloud. To kick things off, Marco prompted the audience to get to know each other, and all enthusiastically turned to their table neighbors. After Marco covered the history of C2C and our early adventures in hosting face-to-face events, Marcy Young (@Marcy.Young), Director of Partnerships at C2C, followed to reiterate our mission statement: we're here to connect Google Cloud customers across the globe. Since March of 2021, when the C2C online community first launched, our community has grown in size to make valuable connections with people like Arsho Toubi (@Arsho Toubi), Customer Engineer, Google Cloud, who followed Young to introduce C2C's partner speakers. All three introductory speakers emphasized the excitement of being able to make new connections in person again. As ten Vaanholt put it, peers introducing themselves and initiating new relationships is "the start of community building." When Toubi announced, "I received some business cards, and that was a fun experience I haven't had in two years," the room responded with a knowing laugh. Toubi also asked the Googlers in the room to stand up so others could identify them. "These are my colleagues," she said. "We're all here to help you navigate how to use GCP to your best advantage."

Getting to Know AMD and DoiT

C2C partners and the sponsors for this event, DoiT and @AMD, shared updates on the partnership between the two companies focused on cloud optimization. The presenters were Michael Brzezinski (@mike.brzezinski), Global Sales Manager, AMD, and Spenser Paul (@spenserpaul), Head of Global Alliances, DoiT. Brzezinski framed the two presentations as a response to a question he received from another attendee he met just before taking the stage, about how the two companies work together to enhance performance while reducing cost. One half of the answer is AMD's compute processors, which Brzezinski introduced one by one. To complete the story of the partnership between the two companies, Spenser Paul of DoiT took the stage with his Labrador, Milton. "I'm joining the stage with a dog, which means you won't hear anything I'm saying from here on," he said as he took the microphone. "And that's totally okay." The key to minimizing cost on AMD's hardware, Paul explained, is DoiT's Flexsave offering, which automates compute spend based on identified need within a workload.

A Fireside Chat with DoiT and Current

Paul invited Trevor Marshall (@tmarshall), Chief Technology Officer of Current, to join him onstage, and both took a seat facing the audience, Milton resting at Paul's feet. After asking Marshall to give a brief introduction to Current, Paul asked him why Current chose Google Cloud. Marshall did not mince words: Current accepted a $100,000 credit allowance from Google after spending the same amount at AWS. Why did Current stay with Google Cloud? The Google Kubernetes Engine. "I like to say we came for the credits, but stayed for Kubernetes," Marshall said.
Paul wryly suggested the line be used for a marketing campaign. The conversation continued through Current's journey to scale and its strategy around cost optimization along the way. When Paul opened questions to the audience, initially none came up. Seeing an opportunity, Paul turned to Marshall and said, "Selfishly, I need to ask you: what's going to happen with crypto?" Just in time, a guest asked what other functionalities Current will introduce in the future. After an optimistic but tight-lipped response from Marshall, another moment passed. Marshall offered Paul a comforting hand and said, "We're all going to make it through," before fielding a few more questions.

Panel Discussion

All our presenters, with the addition of Michael Beal (@MikeBeal), CEO, Data Capital Management, reconvened on stage for a panel discussion. Toubi, who moderated the conversation, began by asking Michael Beal to introduce himself and his company, Data Capital Management, which uses AI to automate the investment process. Beal ran through Data Capital Management's product development journey, and then, when he recalled the company's initial approach from Google, playfully swatted Marshall and said, "The credits don't hurt." Toubi then guided Beal and Brzezinski through a discussion of different use cases for high performance computing, particularly on AMD's processors. When Toubi turned the panel's attention to costs, Paul took the lead to explain in practical detail how DoiT's offerings facilitate the optimization process. "I have an important question," said Toubi. "Can DoiT do my taxes?" Then she put the guests on the spot to compare Google Cloud to AWS's Graviton. Brzezinski was ready for the question: the initial cost savings Graviton provides, he explained, don't translate to better price performance when taking into account the improved overall performance on Google Cloud. Other questions covered financial services use cases for security, additional strategies for optimizing workloads for price performance, and wish-list items for Google Cloud financing options. Marco ten Vaanholt kicked off the audience Q&A by asking what a Google Cloud customer community can do for the customers on the panel. Marshall said he's interested in meeting talented developers, and Beal said he's interested in meeting anyone who can give him ideas. As he put it, "Inspiration is always a very interesting value proposition." After a couple more questions about estimating cost at peak performance and addressing customer pain points, Toubi asked each panelist to offer one piece of advice for someone considering using Google Cloud who isn't already. Again, Paul saw a shot and took it. "If you've never been to Google before," he said, "come for the credits, stay for the Kubernetes."

Winding Down

Following the presentations, all in attendance broke away to connect during a networking reception. To read more about it, check out the exclusive onsite report linked below in the Extra Credit section, and to get involved in the customer-to-customer connections happening in person in the C2C community, follow the link to our live event in Cambridge, MA to register and attend. We look forward to seeing you there!

Extra Credit
Looker is a business intelligence platform used for data applications and embedded analytics. Looker helps you easily explore, share, and visualize your company's data so that you can make better business decisions. During this deep dive, Cat Huang and Tema Johnson, Looker customer engineers at Google Cloud, discussed the value of Looker for startup companies, including recommendations for how to choose a data warehouse, complete with a product demo.

The recording from this session includes the topics listed below, plus plenty of conversation infused in the presentation from open Q&A with community members present at the live event:

(0:00) Welcome and introduction from C2C and the Google Startups Team
(5:25) Looker (creating a data culture) vs. Data Studio (data visualizations)
(9:00) Using Looker and Data Studio together for a complete, unified platform for self-service and centralized BI
(10:10) Using Looker with a data warehouse like BigQuery
(13:15) Serverless big data analytics vs. traditional data warehouses
(14:10) Integrated AI and ML services for data analytics
(15:30) The power of Looker: in-database architecture, semantic modeling layer, and cloud native
(21:05) Live demo: Looker
(40:00) Closing comments and audience Q&A

Watch the full recording below:

Preview What's Next

Join the Google Cloud Startups group to stay connected on events like this one, plus others we have coming up:
The centerpiece of C2C's virtual Earth Day conference, Clean Clouds, Happy Earth, was a panel discussion on sustainability in EMEA featuring C2C and Google Cloud partners HCL and AMD and cosmetics superpower L'Oreal. Moderated by Ian Pattison, EMEA Head of Sustainability Practice at Google Cloud, the conversation lasted the better part of an hour and explored a range of strategies for enabling organizations to build and run sustainable technology on Google Cloud.

According to Sanjay Singh, Executive VP of the Google Cloud Ecosystem Unit at HCL Technologies, when advising customers across the value chain evaluating cloud services, Google Cloud becomes a natural choice because of its focus on sustainability goals. Connecting customers to Google Cloud is a key part of HCL's broader program for maintaining sustainable business practices at every organizational level. "What you cannot measure, you cannot improve," says Singh, which is why HCL has created systems to measure every point of emission under its purview for carbon footprint impact. In alignment with Google Cloud's commitment to run a carbon-free cloud platform by 2030, HCL plans to make its processes carbon neutral in the same timeframe.

Suresh Andani, Senior Director of Cloud Vertical Marketing at AMD, serves on a task force focused on defining the company's sustainability goals as an enterprise and as a vendor. As a vendor, AMD prioritizes helping customers migrate to the cloud as well as making its compute products (CPUs and GPUs) more energy efficient, which it plans to do by a factor of 30 by 2025. On the enterprise side, Andani says, AMD relies on partners and vendors, so making sure AMD as an organization is sustainable extends to its ecosystem of suppliers. One of the biggest challenges, he says, is measuring partners' operations. This challenge falls to AMD's corporate responsibility team.

Health and beauty giant L'Oreal recently partnered with Google Cloud to run its beauty tech data engine. In the words of architect Antoine Castex, a C2C Team Lead in France, sustainability at L'Oreal is all about finding "the right solution for the right use case." For Castex, this means prioritizing Software as a Service (SaaS) over Platform as a Service (PaaS), and only in the remotest cases using Infrastructure as a Service (IaaS). He is also emphatic about the importance of using serverless architecture and products like App Engine, which only run when in use, rather than running and consuming energy 24/7.

For Hervé Dumas, L'Oreal's Sustainability IT Director, these solutions are part of what he calls "a strategic ambition," which must be common across IT staff. Having IT staff dedicated to sustainability, he says, creates additional knowledge and enables a necessary transformation of the way the company works. As Castex puts it, this transformation will come about when companies like L'Oreal are able to "change the brain of the people."

As Castex told C2C in a follow-up conversation after the event, the most encouraging takeaway from the panel for L'Oreal was the confirmation that other companies and tech players have "the same dream and ambition as us." Watch a full recording of the conversation below, and check back to the C2C website over the next two weeks for more content produced exclusively for this community event. Also, if you're based in EMEA and want to connect with other Google Cloud customers and partners in the C2C community, join us at one of our upcoming face-to-face events:

Extra Credit:
On April 21, 2022, C2C hosted a live virtual Earth Day conference dedicated to all things sustainability in the cloud. Built around a live panel of C2C and Google Cloud customers and partners, the Clean Clouds, Happy Earth program also included a series of sessions featuring Google representatives, exploring different use cases and topics of high priority for all collaborators on the shared mission to bring about a clean and happy future for the cloud and the Earth. Over the next three weeks, we’ll be publishing these sessions on our website for you to view, share, and discuss with the rest of the C2C community. The video below is a presentation from Jenny Fernandez, Google Cloud’s Human Truths Lead in EMEA, about using data on human consumption patterns to inform more sustainable business and technical solutions: Check back to this page in the coming days for more of the content produced for this event, and please feel free to share your own thoughts here or directly to our community. Extra Credit:
People with COVID-19 are typically advised to self-isolate for two weeks, with some patients needing comprehensive home care. Mayo Clinic's Center for Connected Care originally designed its Remote Patient Monitoring Program for patients with chronic conditions. Now it has adapted the model for patients with COVID-19. Quarantined Mayo Clinic patients participating in the Remote Patient Monitoring Program receive medical devices they use to screen and electronically transmit their vital signs. A team of remote nurses regularly monitors the patients' health assessment data and contacts the patients if their conditions worsen or if they may require support.

How the Remote Patient Monitoring Program Works

Mayo's Remote Patient Monitoring Program serves two categories of patients:

Patients at moderate to high risk for complications are given remote patient monitoring kits with blood pressure cuffs, thermometers, pulse oximeters, and a scale. Two to four times a day, patients use these devices to record their vital signs and transmit them to Mayo Clinic through the tablets they receive with their kits. Mayo's Patient Monitoring nurses monitor these vital signs and call patients to ask if they are experiencing COVID-19 symptoms such as vomiting, nausea, or diarrhea.

Patients at low risk for complications monitor their conditions each day through the Mayo Clinic app. They receive a daily alert reminding them to provide their health assessments to their Mayo Patient Monitoring team.

What Is Remote Monitoring?

Remote patient monitoring allows physicians and healthcare facilities to track outpatient progress in real time. Caregivers also use this technology for geriatric wellness monitoring. Devices used for remote patient monitoring include wearable fitness trackers, smart watches, ECG monitors, blood pressure monitors, and glucose monitors for diabetes. Collected data is electronically transmitted to the patient's doctors for assessment and recommendations. Benefits of this technology include:

Remote care reduces the burden on healthcare practitioners and healthcare organizations. Hospitals and clinics save on operational costs by reducing readmissions, staff engagement, and in-person visits.

Remote patient devices enable early detection of deterioration and comorbidities, thereby reducing emergency visits, hospitalizations, and the duration of hospital stays.

According to the Financial Times, remote patient technology could save the U.S. a total of $6 billion per year. A more recent scientific report calculated $361 in savings per patient per day, or around $13,713 in total savings per patient per year.

Results

Mayo Clinic's Remote Patient Monitoring Program has reduced its caseload from 800 COVID-19 patients to 350 patients with intensive needs. These patients were connected to one or two physicians per shift who monitored their symptoms and escalated care as needed. One such patient reported: "[This program] was our lifeline…. It just took some of that fear away, because we knew that there was somebody still there taking care of us with our vital signs. It motivated us to do better on getting well."

The Impact of Google Cloud

Mayo Clinic uses Google Cloud and Google Health to positively transform patient and clinician experiences, improve diagnostics and patient outcomes, and conduct innovative clinical research. In addition to building its data platform on Google Cloud, Mayo uses Google Health to create machine-learning models for assessing symptoms of serious and complex diseases.
The field of data engineering is growing, and demand for certified data engineers is growing even more rapidly. The Google Cloud Data Engineer certification was the top-paying cloud certification of 2021, after the Cloud Architect certification held the distinction for the two preceding years. Given these stats, some questions are likely coming to mind for C2C community members looking into Google Cloud certifications: What's driving the increase in interest in this field? Is data engineering the right choice for Google Cloud users looking for a new professional challenge? What is data engineering in the first place?

Peak6 Technologies Principal Engineer Dan Sullivan recently joined C2C for a coffee chat to answer these questions and others, including:

How is data engineering related to software engineering?
What are DevOps, DataOps, and MLOps?
What are the different compute and storage options on the Google Cloud Platform?
What topics are covered in the Professional Data Engineer exam?

Watch a full video of the event below to hear Sullivan's answers and more questions from members of the C2C community:

Extra Credit:
This session was led by Quantiphi, a foundational partner of C2C and a Google Partner that uses AI to solve the problems at the heart of businesses. Connect with them directly @Quantiphi in the C2C community.

One of the best ways for enterprises across a broad range of business sectors to remain relevant is to use consumer behavior data in ways that will help their brands stand out from the competition. Using this data effectively and uniquely can help businesses improve the rate of customer acquisition, increase the ROI of marketing spend, and ensure customer centricity and personalization. But what can we do to improve customer experiences by leveraging customer data, and how? To learn more, C2C sat down with Vijay Mannur, Customer and Marketing Analytics Practice Lead at Quantiphi, to discuss how to enhance consumer engagement and conversion using behavioral data.

The recording from this Deep Dive includes:

(1:55) Agenda overview and introduction to speakers
(8:05) Marketing analytics: how and why Quantiphi built a dedicated marketing and analytics team; options for marketing analytics from Google Cloud
(14:50) Consumer data: third-party vs. first-party cookies and rich data quality; Consumer Data Platform (CDP) vs. traditional Customer Relationship Management (CRM); how to build and upskill teams to use CDP effectively; using BigQuery and other Google Cloud analytics tools
(32:25) Examples of customer stories using CDP: how a French retailer centrally connected its consumer databases with custom pipelines from BigQuery; how a bank optimized consumer segmentation and profiling using Vertex AI
(39:00) Future of analytics: the future of consumer data and trends nearing the end of their lifecycle; addressing privacy concerns using Google Cloud data warehousing and analytics solutions; ethical use of machine learning for consumer behavior

Speakers featured in this Deep Dive:

Vijay Mannur, Practice Head, Customer and Marketing Analytics, Quantiphi. Vijay Mannur is a Practice Head at Quantiphi with 12+ years of experience in performance marketing, sales, and analytics. He leads the Customer and Marketing Analytics practice at Quantiphi, a leading digital transformation and AI solutions company, where he has grown the practice to encompass engineering teams building cutting-edge solutions, delivery teams, and sales teams. He has delivered multiple large-scale digital transformation solutions to the marketing teams of large retail and FSI clients. Prior to Quantiphi, Vijay worked for companies like Media.net, Idea Cellular, and NEC Corporation.

Daniel Lees, Staff Partner Engineer, Google Cloud. A cloud architect at Google, Daniel Lees was a Principal Architect in Financial Services Select, helping Google's most valued clients build on the Google Cloud Platform before joining the Partner Engineering team in support of Google's most important partners. He has extensive expertise in defining best practices, blueprints, and security and compliance standards, and in evangelizing reusable assets for cloud deployment in CI/CD pipelines with IaC, working on both cloud-native and hybrid application modernization. Before Google, he had 20 years of experience at HSBC Bank, where he was the Chief Technical Architect for AWS Cloud globally, leading a small team of SME cloud architects.

Other Resources: Redefine customer and marketing analytics; Google Cloud marketing analytics & AI solutions; Responsible AI practices
On January 11, 2022, C2C members @antoine.castex and @guillaume blaquiere hosted a powerful session for France and beyond in the cloud space. C2C Connect: France sessions are intended to bring together a community of cloud experts and customers to connect, learn, and shape the future of cloud.

60 Minutes Summed Up in 60 Seconds

Yuri Grinshteyn, Customer SRE at Google Cloud, was the guest of the session. Also known as "Stack Doctor" on YouTube, Grinshteyn advocates for the best ways to monitor and observe services, following the SRE practices Google has learned in its own service SRE teams. Grinshteyn explained the difference between monitoring and observability: monitoring is "only" the data about a service or resource, while observability is the behavior of the service's metrics through time. To observe data, you need different data sources: metrics, of course, but also logs and traces. There are several tools available, each serving observability: Fluentd, OpenCensus, Prometheus, Grafana, etc. All are open source, portable, and compatible with Cloud Operations. The overhead of instrumented code is practically invisible, and the metrics it provides matter far more than the few CPU cycles lost to it. Microservices and monoliths alike should use trace instrumentation: even a monolith never works alone, since it uses Google Cloud services, APIs, databases, and so on. Trace allows us to understand North-South and East-West traffic.

Get in on the Monitoring and Observability Conversation!

Despite its 30-minute time limit, this conversation didn't stop. Monitoring and observability is a hot topic, and it certainly kept everyone's attention. The group spent time on monitoring, logging, error budgets, SRE, and other topics such as Cloud Operations, Managed Service for Prometheus, and Cloud Monitoring. Members also shared likes and dislikes. For example, one guest, Mehdi, "found it unfortunate not to have out of the box metrics on GKE to monitor golden signals," and said "it's difficult to convince ops to install Istio just for observability."

Preview What's Next

Two upcoming sessions will cover topics that came up but didn't make it to the discussion floor. If either of these events interests you, be sure to sign up to get in touch with the group!

Extra Credit

Looking for more Google Cloud product news and resources? We've got you. The following links were shared with attendees and are now available to you: the video of the session, Cloud Monitoring, Managed Service for Prometheus, the sre.google website, the SRE books, and the Stack Doctor YouTube playlist.
The effort to combat climate change is such a major undertaking that no metaphor does it justice. It will take more than “all hands on deck.” We need to be more than “on board.” Every one of us has a crucial role to play. That’s why the data we have must be available to the entire public, not just governments and corporations.In October 2021, Google Cloud established partnerships with five companies engaged in environmental data collection efforts: CARTO, Climate Engine, Geotab, Egis, and Planet Labs. These companies are working with Google to make their datasets available globally on Google Cloud. As a 2020 Google Cloud Partner of the Year and a company with a stated commitment to sustainability, C2C foundational partner SpringML is excited to raise awareness of this initiative.In this fireside chat, Lizna Bandeali and SpringML’s Director of Google Cloud Services Masaf Dawood explore the background and the implications of this recent effort. Key points discussed include ease, transparency, and accessibility of data, and a focus on actionable insights. With the datasets available and Google Cloud Platform tools like BigQuery, organizations and individuals working in environmental science, agriculture, food production, and related fields can make informed predictions about everything from weather patterns to soil quality. These organizations and individuals can use these predictions to plan future resource use around vital sustainability guidelines. Watch the full video below:Are you an individual or a decision-maker at an organization pursuing sustainability? What are you doing to take up this effort? Contact us on our platform and tell us your story!
The Google Cloud Platform hosts all kinds of tools for data storage and management, but two of the most versatile and popular are Bigtable and BigQuery. While each service is a database, the key difference between the two lies in their names. Bigtable (BT) is literally a "big table" that scales to terabytes if not petabytes for storing and collecting your data. BigQuery (BQ), on the other hand, conducts a "big query" into your massive troves of data. Each database has other unique attributes that define when and how to use it. These topics, along with use cases, case studies, and costs associated with each product, are covered in the following sections.

Bigtable

Bigtable, Google Cloud's fully managed database for hefty analytical and operational workloads, powers major Google products like Google Search, Google Maps, and Gmail. The database supports high read/write throughput, processes reads and writes at ultra-low latency, and scales to billions of rows and thousands of columns to hold massive troves of data. Bigtable pairs well with Google Cloud data processing and analytics products such as BigQuery, Dataflow, and Dataproc, and it integrates well with big data tools such as Hadoop, Beam, and Apache HBase.

Bigtable Use Cases

Bigtable is best used for instances with lots of data, such as the following:

Time series data, e.g., CPU usage over time for multiple servers.
Financial data, e.g., currency exchange rates.
Marketing data, like customers' purchase histories and preferences.
Internet of things data, such as usage reports from home appliances.
Fraud detection, i.e., detecting fraud in real time on ongoing transactions.
Product recommendation engines that handle thousands of personalized recommendations.
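To give a feel for the Bigtable data model described above, here is a minimal write-and-read sketch using the google-cloud-bigtable Python client for the time series case; the project, instance, table, and column family names are hypothetical, and the table with its "stats" column family is assumed to already exist:

```python
from google.cloud import bigtable

# Hypothetical project and instance; the "metrics" table and its
# "stats" column family are assumed to have been created already.
client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("metrics")

# Row keys like "server1#202306011200" keep one server's samples adjacent,
# which is the usual Bigtable pattern for time series data.
row_key = b"server1#202306011200"
row = table.direct_row(row_key)
row.set_cell("stats", b"cpu_pct", b"73")
row.commit()

# Point reads by row key are what Bigtable serves at ultra-low latency.
fetched = table.read_row(row_key)
print(fetched.cells["stats"][b"cpu_pct"][0].value)  # b"73"
```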
Differences Between Bigtable and BigQuery

While Bigtable collates and manages your data, BigQuery analyzes those troves of data. Bigtable resembles an Online Transaction Processing (OLTP) tool, executing large numbers of concurrent transactions—such as online banking, shopping, order entries, or text messages. BigQuery, in contrast, is ideal for Online Analytical Processing (OLAP)—for creating analytical business reports or dashboards; in short, for anything related to business analysis, such as scrolling through last year’s logs to see how to improve the business. Bigtable is NoSQL, which gives it a flexible schema, while BigQuery uses SQL, making it ideal for performing complex analytical queries over huge datasets. Don’t expect to use BigQuery as a regular relational database or for CRUD (Create, Read, Update, and Delete) workloads: its storage is largely immutable, optimized for appending and scanning data rather than editing or removing individual records.

Case Studies

Companies use Bigtable for structuring and managing their massive troves of data, while they use BigQuery for mining insight from those troves. Below are a few examples of how businesses have used each in practice:

Bigtable: Digital fraud detection and payment solution company Ravelin uses Bigtable to store and query 1.2 billion transactions from more than 230 million active users. AdTech provider OpenX uses Bigtable to serve more than 30,000 brands, more than 1,200 websites, and more than 2,000 premium mobile apps, processing more than 150 billion ad requests per day. Dow Jones DNA uses Bigtable for fast, robust storage of key events documented in over 30 years of the company’s news content.

BigQuery: UPS uses BigQuery to achieve precise package volume forecasting. Major League Baseball analyzes its marketing with BigQuery as it expands its fan base with highly personalized, immersive experiences. The Home Depot uses BigQuery to manage customer service and keep 50,000 items routinely stocked across 2,000 stores.

Costs

When using BigQuery, you pay for storage, based on how much data you store. There are two storage rates: active storage ($0.020 per GB) and long-term storage ($0.010 per GB); with both, the first 10 GB are free each month. You also pay for processing queries: query costs are either on-demand (charged by the amount of data processed per query) or flat-rate. BigQuery also charges for certain other operations, such as streaming inserts and use of its Storage API, while loading and exporting data is free. For details, see BigQuery pricing. With Bigtable, you pay for your instance’s node capacity, storage, and network bandwidth, and pricing varies by region; see Bigtable pricing for details.

If you’re ready to start using or testing either product for a current or upcoming project, you can create a Bigtable instance from the Cloud Console’s project selector page or through the Cloud Bigtable Admin API. BigQuery is accessible via the Google Cloud Console, the BigQuery REST API, or an external tool such as a Jupyter notebook or business intelligence platform.

Extra Credit:
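As a quick illustration of the query-engine side, here is a minimal sketch using the google-cloud-bigquery Python client. The dataset and table names are placeholders for whatever analytical data you have loaded into your own project:

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()  # picks up your default project and credentials

# Placeholder orders table -- substitute a dataset and table you have loaded.
query = """
    SELECT customer_id, SUM(order_total) AS lifetime_value
    FROM `my-project.sales.orders`
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
    LIMIT 10
"""

# BigQuery runs the scan server-side; on-demand pricing bills by bytes processed.
for row in client.query(query).result():
    print(row.customer_id, row.lifetime_value)
```

Note that the client never provisions servers or indexes; you submit SQL and BigQuery handles the execution, which is the serverless OLAP model described above.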
Machine Learning (ML) is a major solution business and technical leaders can use to drive innovation and meet operational challenges. For managers pursuing specific organizational goals, ML is not just a tool: it’s a mindset. C2C’s community members and partners are dynamic thinkers; choosing the right products for their major projects requires balancing concrete goals with the flexibility to ask questions and adapt. With these considerations in mind, C2C recently invited Google Cloud Customer Engineer KC Ayyagari to host a C2C Deep Dive on The ML Mindset for Managers.

Ayyagari started the session by asking attendees to switch on their cameras, then ran a sentiment analysis of their faces in the Vision API. After giving some background on basic linguistic principles of ML, he demonstrated an AI trained to play Atari Breakout via neural networks and deep reinforcement learning. To demonstrate how mapping applications can use ML to rank locations according to customer priority, he asked the attendees what considerations they might take into account when deciding between multiple nearby coffee shops to visit.

As a lead-in to his talking points about the ML mindset for managers, Ayyagari asked attendees for reasons they would choose to invest in a hypothetical startup he founded versus one founded by Google’s Madison Jenkins. He used the responses as a segue into framing the ML mindset in the terms of the scientific method: startup management should start with a research goal, he explained, and ML products and functions should be means of testing that hypothesis and generating insights to confirm it.

Before outlining a case study of using ML to predict weather patterns, Ayyagari asked attendees what kinds of data would be necessary to use ML to chart flight paths around safe weather. Guest Jan Strzeiecki offered an anecdote about the flight-planning modus operandi of different airports. Ayyagari provided a unique answer: analyzing cloud types based on those associated with dangerous weather events.

The theme of Ayyagari’s presentation was thinking actively about ML: in every segment, he brought attendees out of their comfort zones to get them to brainstorm, just as an ML engineer prompts machines to synthesize new data and learn new lessons. ML is a mindset for this simple reason: machines learn just like we do, so in order to use them to meet our goals, we have to think and learn along with them.

Are you a manager at an organization building or training new ML models? Do any of the best practices Ayyagari brought up resonate with you? Drop us a line and let us know!

Extra Credit:
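A demo like the face-sentiment exercise Ayyagari opened with can be approximated in a few lines against the Vision API’s face detection endpoint. This is a minimal sketch, not his actual code, and the image filename is a placeholder:

```python
# pip install google-cloud-vision
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder image file -- e.g., a webcam frame saved to disk.
with open("attendee_frame.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Face detection returns per-face likelihood scores for basic emotions.
response = client.face_detection(image=image)
for face in response.face_annotations:
    print(
        "joy:", vision.Likelihood(face.joy_likelihood).name,
        "surprise:", vision.Likelihood(face.surprise_likelihood).name,
    )
```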