Configure highly available data and real-time analytics to optimize performance, improve decision-making, and unlock new value.
In this video and sketchnote I summarize the six steps of data science and point you to the tools in Google Cloud that help you achieve the outcome of these steps. This will give you an overview of data science on Google Cloud in 10 minutes. We cover the six steps of data science:
1. Data Engineering
2. Data Analysis
3. Model Development
4. ML Engineering
5. Insights Activation
More sketches here
There is a huge domain of using optimization to solve business problems. My company is using optimization, including https://developers.google.com/optimization, to solve broadband network design and funding problems. I'm very interested to learn if anyone else is using it and how.
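To illustrate what such an optimization model can look like, here is a minimal sketch using the OR-Tools linear solver; the links, demand, and costs below are hypothetical and not from the original post:

    # Minimal sketch using Google OR-Tools (pip install ortools); the network model
    # below is made up and only illustrates the solver API.
    from ortools.linear_solver import pywraplp

    solver = pywraplp.Solver.CreateSolver("GLOP")  # linear programming backend

    # Decision variables: capacity (in Gbps) to build on two candidate links.
    link_a = solver.NumVar(0, 100, "link_a")
    link_b = solver.NumVar(0, 100, "link_b")

    # Constraint: together the links must serve at least 120 Gbps of demand.
    solver.Add(link_a + link_b >= 120)

    # Objective: minimize build cost (hypothetical unit costs per Gbps).
    solver.Minimize(3 * link_a + 5 * link_b)

    if solver.Solve() == pywraplp.Solver.OPTIMAL:
        print("link_a:", link_a.solution_value(), "link_b:", link_b.solution_value())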
I created Connected Sheets with BigQuery. But when building the data retrieval, we can write any request against the database. From my point of view, that creates a security breach. How can I manage a connection to a single table in BigQuery? Thank you for your answers.
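One possible approach (a hedged sketch, not a confirmed answer from the thread) is BigQuery table-level access control: grant the Connected Sheets user the Data Viewer role on the single table only, rather than on the whole dataset. The project, dataset, table, and user below are placeholders:

    # Sketch: grant roles/bigquery.dataViewer on one table only; names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table("my-project.my_dataset.allowed_table")

    policy = client.get_iam_policy(table)
    policy.bindings.append({
        "role": "roles/bigquery.dataViewer",
        "members": {"user:analyst@example.com"},
    })
    client.set_iam_policy(table, policy)  # the user can now read only this table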
Hi all, next Friday our friends from SpringML will be talking about "Addressing the Supply Chain Crisis with AI". During the last two years, since the world first fell into the COVID-19 pandemic, we have seen a number of disturbances throughout all industries. One in particular was the dramatic impact on the supply chain worldwide and the effects on all consumers. Join SpringML as we discuss how combining artificial intelligence with a number of additional technologies and innovations can be the answer to addressing consumer demand, labor shortages, and preventative measures. I hope to see you all there. There will be time for questions and comments!
Hi all, I have a BigQuery table which stores data from Firestore. Firestore stores all timestamps in the UTC timezone. Is there a best practice for how I can display those timestamps in the user's timezone (not in UTC) in Google Data Studio dashboards? I would like to avoid converting the timestamp by adding another field such as CreatedDateNY (using DateDiff). Regards, Manish
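A hedged sketch of one common approach (assuming the dashboard is fed by a BigQuery query): convert the UTC TIMESTAMP to a DATETIME in the viewer's timezone inside the query itself, so no extra stored column is needed. The table and column names here are hypothetical:

    # Sketch using the google-cloud-bigquery client; table/column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()
    query = """
        SELECT
          order_id,
          -- DATETIME(timestamp, time_zone) renders the UTC timestamp in the chosen zone
          DATETIME(created_at, 'America/New_York') AS created_at_local
        FROM `my-project.my_dataset.orders`
    """
    for row in client.query(query).result():
        print(row.order_id, row.created_at_local)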
We will start the new year with a new C2C Connect Coffee Chat series for Data and Analytics. Join this recurring event once a week to meet other people interested in Data and Analytics on Google Cloud.
What is the C2C Connect: Data and Analytics Coffee Chat? These Coffee Chats are informal meetings designed to give Google Cloud customers a place to meet, casually interact, and share knowledge with other users.
How many Chats do I need to attend? You can attend as many Coffee Chats as you want. Some members use them sporadically to show and exchange information with others, and other members use them as a regular place to gain and share knowledge and network.
How can I get the most out of these events? There is always something to comment on or talk about in the Coffee Chats, but coming prepared with something to share is the best strategy. Use this forum (look for the new topic button on the top right) to post doubts, questions, or a story in advance, so other people know how to help before
Hi all, I am a little bit confused and I would like your clarification.
About Dataflow pricing: when working with Dataflow, we have to specify a GCS bucket for temp_location. So is the use of this bucket subject to Google Cloud Storage pricing?
About BigQuery pricing: when I do a batch load from Dataflow to BigQuery, I have to specify a bucket on Google Cloud Storage. Do I also have to pay for the usage of this bucket? And after the load to BigQuery, will the data be deleted from this bucket?
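For context, here is a hedged sketch (all project, bucket, and table names are placeholders) of where those buckets are specified in an Apache Beam/Dataflow pipeline:

    # Sketch of a Beam/Dataflow pipeline that batch-loads into BigQuery via GCS;
    # project, bucket, and table names are hypothetical.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        # Dataflow stages temporary files here; the bucket itself is billed as
        # ordinary Cloud Storage while objects exist in it.
        temp_location="gs://my-bucket/dataflow_tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | beam.Create([{"name": "a", "value": 1}])
         | beam.io.WriteToBigQuery(
               "my-project:my_dataset.my_table",
               schema="name:STRING,value:INTEGER",
               # FILE_LOADS writes batch-load files to a GCS location first
               # (temp_location by default, or the bucket given below).
               method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
               custom_gcs_temp_location="gs://my-bucket/bq_load_tmp",
           ))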
Think outside the box. My name is Abdul Wahab Amir. I am a 7th-semester BSSE student at Usman Institute of Technology. Here is the plan I have made after analyzing the current situation in my area. I have not found anyone in my area giving special support to orphans and the poor, and if anyone does, the first thing they try to do is publicize that help on the internet. So, apart from this, I have decided on a good plan for young people who are searching for jobs near their areas and whose hard work is going in vain. As we know, digital technologies have grown faster than any innovation in our history. So I have decided to serve people through digitalization by creating a website that takes the concerned user's basic information in his native language, rapidly builds a resume or CV, and, according to his current location, forwards it to nearby organizations' and companies' emails in a well-mannered format generated by the website, and as companies reply to his particu
Twitter's Architecture Diagram
Learn how Twitter's data engineers built their social media platform on their data center services and Google Cloud using products like Dataflow, Pub/Sub, Bigtable, and BigQuery 👉 https://goo.gle/3qGyM5o
What (if anything) surprises you about their data diagram? Have any questions about this story? Tell/ask us in the comments!
Going through Google Data Catalog, it seems to be very effective at maintaining metadata across an organization's sources without physically moving them. However, if one has to build a semantic layer on the information gathered by Data Catalog, how could we do that? E.g. building relations between entities of different sources (if there are any), understanding the gaps in information/relations, and activating the metadata using something like a Knowledge Graph, semantic modeling, etc. Can we access Google Data Catalog data from a third-party tool to build a graph model?
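On the last question, here is a hedged sketch (project ID and search query are placeholders) of reading Data Catalog entries from outside the console with the Python client; a third-party graph or semantic-modeling tool could build on output like this:

    # Sketch using the google-cloud-datacatalog client to pull catalog entries into
    # an external graph model; the project name and query are hypothetical.
    from google.cloud import datacatalog_v1

    client = datacatalog_v1.DataCatalogClient()

    scope = datacatalog_v1.SearchCatalogRequest.Scope(
        include_project_ids=["my-project"]
    )

    # Find table entries known to Data Catalog, then fetch their schemas.
    for result in client.search_catalog(scope=scope, query="type=table"):
        entry = client.get_entry(name=result.relative_resource_name)
        print(entry.display_name, entry.linked_resource)
        for column in entry.schema.columns:
            # Column names/types could become nodes and edges in a graph model.
            print("  ", column.column, column.type_)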
Hello community, I work in a Product Owner role for data science service solutions, and we have started to use GCP and the GA360 BigQuery export quite heavily. We have just noticed that some of our past intraday tables are partly not cleared, i.e. deleted. Does this sound familiar to anyone, and is there a solution for it? Best wishes, Matthias
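As a hedged diagnostic sketch (not a fix, and the dataset name is a placeholder), you could list intraday tables whose daily ga_sessions_ counterpart already exists, i.e. the ones that should have been cleared:

    # Sketch: find GA360 intraday tables that were not deleted even though the
    # corresponding daily export table exists; the dataset name is hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = "my-project.my_ga360_dataset"

    tables = [t.table_id for t in client.list_tables(dataset)]
    intraday = {t for t in tables if t.startswith("ga_sessions_intraday_")}
    daily = {t for t in tables if t.startswith("ga_sessions_2")}

    for table_id in sorted(intraday):
        suffix = table_id.rsplit("_", 1)[-1]  # YYYYMMDD
        if f"ga_sessions_{suffix}" in daily:
            # The daily export exists, so this intraday table should have been removed.
            print("leftover intraday table:", table_id)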
I came across this application called Rowy. It's software built on GCP to manage Firestore data with a spreadsheet-like UI, and it also allows you to write cloud functions that react to changes in database fields. It seems like quite an interesting project to me; it is open-source and allows you to host the full software in your own GCP project, so technically you are the owner of the data.
Hey! Good morning, Data/Analytics Group! Wake Up with C2C is a new offering where community members can spend 30 minutes every Friday at 10 am CST discussing recent Data/Analytics headlines and sharing their current projects. You supply the coffee, we'll bring the conversation. Please register for the event; you'll find the joining information in the invite.
When I use the data transfer from S3 to BigQuery, it stores duplicate data when I try to load the last 7 days of data. But if I use the same transfer from SA360 to BigQuery, it updates existing records and appends new ones. Is there a way I can enable the same behavior for the S3 transfer, or is there a workaround, such as a sample Pub/Sub model that can remove duplicates?
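One common workaround (a hedged sketch, not a built-in feature of the S3 transfer that I can confirm) is to land the transfer in a staging table and MERGE it into the final table, so re-loaded rows update existing records instead of duplicating them; the table and key names are hypothetical:

    # Sketch: upsert a freshly transferred staging table into the main table so that
    # re-transferred rows do not create duplicates. Names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()
    merge_sql = """
        MERGE `my-project.my_dataset.orders` AS target
        USING `my-project.my_dataset.orders_s3_staging` AS source
        ON target.order_id = source.order_id
        WHEN MATCHED THEN
          UPDATE SET target.status = source.status, target.updated_at = source.updated_at
        WHEN NOT MATCHED THEN
          INSERT ROW
    """
    client.query(merge_sql).result()  # runs the upsert as a single atomic statement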
1 List of acronyms and terms
AWS: Amazon Web Services
GCP: Google Cloud Platform
GCS: Google Cloud Storage
S3 bucket: public cloud storage resource available in Amazon Web Services
2 Introduction
BigQuery is a serverless and highly scalable data warehouse. It makes it possible to query petabytes of data and gain insights within minutes, and it provides an ecosystem for doing different kinds of analytics such as business intelligence, geospatial analysis, machine learning, and AI. BigQuery Omni is a multi-cloud analytics solution that brings the capability of querying data across other cloud computing platforms such as AWS and Azure. At SoftServe, we've been working with BigQuery Omni since its launch and found that it can be used to simplify data migration challenges.
3 Data reconciliation
During the migration process, it may be difficult to avoid different kinds of data corruption. Business metrics may lose their precision, records may be truncated, and some data may just be lost. To be
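To make the reconciliation idea concrete, here is a hedged sketch (all table and column names are hypothetical) that compares row counts and a simple checksum between a source table, for example one reachable through BigQuery Omni, and the migrated table:

    # Sketch of a simple reconciliation check: compute a row count and a checksum for
    # the source and the migrated table, then compare them. Names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()

    CHECK = """
        SELECT COUNT(*) AS row_count,
               SUM(FARM_FINGERPRINT(CONCAT(CAST(id AS STRING), '|', CAST(amount AS STRING)))) AS checksum
        FROM `{table}`
    """

    source = list(client.query(CHECK.format(table="omni-project.aws_dataset.orders")).result())[0]
    target = list(client.query(CHECK.format(table="my-project.my_dataset.orders")).result())[0]

    # Matching counts and checksums are a quick signal that nothing was truncated or lost.
    print("rows:", source.row_count, target.row_count)
    print("checksums match:", source.checksum == target.checksum)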
Do you know how to create a fully functional automated initial ETL + Analytics workload? Did you create an automated Analytics workload end-to-end using Cloud Functions, Data Fusion, BigQuery, and Data Studio?
Image source: DV's article on Medium
In this article Daniel Villegas uses baby steps to explain how we can do it. Read the article, and then all you have to do is keep feeding data into GCS; whenever you need it, you can have Data Studio send scheduled reports with up-to-date data.
QUIZ
Which Cloud Data Fusion edition offers FREE hours per month per account, and how many are there? The author uses BigQuery. For building Replication jobs, which BigQuery pricing model is recommended?
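As a hedged, simplified sketch of the "keep feeding data into GCS" idea (this is not the article's Data Fusion pipeline, and the bucket, dataset, and table names are placeholders), a Cloud Function triggered on each upload could load new CSV files into BigQuery:

    # Sketch of a GCS-triggered Cloud Function (Python runtime) that loads new CSV
    # files into BigQuery; the dataset/table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()

    def load_new_file(event, context):
        """Background function triggered by a google.storage.object.finalize event."""
        uri = f"gs://{event['bucket']}/{event['name']}"
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,                   # let BigQuery infer the schema
            write_disposition="WRITE_APPEND",  # keep appending as files arrive
        )
        job = client.load_table_from_uri(uri, "my-project.my_dataset.raw_events", job_config=job_config)
        job.result()  # wait for the load job so errors surface in the function logs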