A guest post by Your Sidekicks AG.
With Google Analytics 4 (GA4), Google is building a future-oriented solution to counter the ever-increasing impact of data protection measures. At the end of 2023, the old Universal Analytics and its features will be deleted. Find out how GA4 works, what the tool can do, and how to find your way around it.
Summary – Why the change and what does it mean for you?
The world of data collection is changing. An ever-growing need for data protection and privacy online is leading to new data protection laws such as the GDPR in the EU and the revised Federal Act on Data Protection (FADP) in Switzerland (from mid-2023).
Big players like Google, Apple, Meta and Microsoft are reacting to the growing need for anonymity with corresponding software updates, such as Apple’s iOS 14 update or ad blockers activated by default in browsers.
We don't need to persuade you of the importance of data. To say data is important for business today would be like saying the sun is important for daylight. It's a given. In fact, data-driven companies are 58% more likely to surpass their revenue goals than those that are not focused on data.
But data in isolation isn't enough.
Despite the billions spent every year in the pursuit of becoming more data-driven, only 8% of businesses successfully scale their analytics to get value from their data.
Sunlight alone isn't enough to grow a garden; it also needs water, seeds, care, and oversight. Together, those components create an environment of collaboration working towards a unified goal: growth.
In the case of data, this means fostering a data culture, or rather, a culture of data.
Thanks to third-party cookies, advertisers have dined out on an endless buffet of rich audience data for years. Consumers could be targeted with virtually unrestricted pinpoint accuracy, providing marketers with a powerful way to grow their business.
However, as of late 2023, third-party cookies will no longer be supported by Google Chrome, a change that has advertisers scrambling to find new ways to target their audiences.
Necessity is the mother of invention, of course, so advertisers need not panic: the data clean room offers a new way to target consumers that relies neither on cookies nor on violating user privacy.
Let's take a look at what the whole "cookieless" thing is about, why it matters, and how the data clean room is the answer digital marketers are looking for.
This is Part 3 of our mini series 'Master Data Management — What It Is and Why You Need It'
The ultimate goal of MDM is to have an SSOT (single source of truth), so that your data lives in one place, and a golden record, so that each time you access it, you know you are seeing the latest and most accurate version of your organization's data.
There are many ways to achieve an SSOT and a golden record, but we've found the following to be the best approach.
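To make the idea of a golden record concrete, here is a minimal sketch in Python. The customer fields, the sample values, and the "most recent non-null value wins" survivorship rule are all illustrative assumptions, not a prescribed MDM implementation:

```python
from datetime import date

# Hypothetical duplicate records for the same customer from different systems.
records = [
    {"email": "jane@example.com", "phone": None, "city": "Zurich",
     "updated_at": date(2022, 1, 10)},
    {"email": "jane@example.com", "phone": "+41 79 000 00 00", "city": None,
     "updated_at": date(2022, 3, 5)},
    {"email": "jane.doe@example.com", "phone": None, "city": "Basel",
     "updated_at": date(2021, 6, 1)},
]

def golden_record(records):
    """Survivorship rule: for each field, keep the most recent non-null value."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated_at"]):
        for field, value in rec.items():
            if value is not None:
                merged[field] = value  # newer records overwrite older values
    return merged

print(golden_record(records))
```

In practice the survivorship rules are usually per field (e.g. trust the CRM for email, the billing system for addresses), but the principle is the same: one deterministic merge that always yields the latest, most complete record.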
This is Part 2 of our mini series 'Master Data Management — What It Is and Why You Need It'
We are swimming in data, so it's easy to feel like we are data-driven and analytical in our decision-making. In reality, we are at the mercy of multiple data systems, each with its own goals that may not be aligned with our organization's.
MDM allows organizations to be truly data-driven by being in complete control of their data. This control comes from enabling anyone in the organization to get a holistic view of the most up-to-date data at all times.
This is Part 1 of our mini series 'Master Data Management — What It Is and Why You Need It'
As important as master data management (MDM) is to your organization, we admit it's not the most approachable topic — it's full of complexities and in some cases, jargon that is used simply to make us IT types feel a bit smart.
At its core though, MDM is fundamental to any organization that wants to run smoothly today and plan for tomorrow by organizing all of its data neatly and uniformly in one place.
We're here to help de-mystify MDM for you — so here's a quick overview of what MDM is, why it should be important to you, and the ultimate goal of getting it right.
Big Data: a term that has become very trendy in recent years. Everyone who works with data in some way knows the term, and yet there is still uncertainty about what Big Data actually is and how we should cope with it.
In this blog post, we will elaborate on why the term “Big Data” refers not only to increased volumes of data but also to its broader scope. We will therefore address not only its definition but also its potential, its challenges, and the reason why it will fundamentally change the way we see our world.
What is Big Data?
In common terms, Big Data is defined by the famous three V’s: Volume (the sheer amount of data), Velocity (the speed at which data is generated and processed), and Variety (the range of data types and sources).
The three V’s can be extended with ‘Veracity’ and ‘Variability’: the former refers to the varying quality of data, while the latter describes the irregular, unpredictable streams of incoming data.
What’s behind the hype?
The hype around Big Data can be explained by looking at three main dimensions: the ongoing disaggregation of data, increased computer processing capacity, and the more powerful data analysis methods now available.
Author: Benjamin Saladin
We have probably all been in discussions about automating repetitive tasks, and very likely the outcome was: “This seems too expensive to automate.” But that decision is most likely based on experience and gut feeling. Don’t get me wrong, this can be a valid approach if there are no underlying data or models that would help you make a better estimate of your ROI (return on investment).
On the other hand, automation is often a valuable approach. Using the dashboard below, you can check the impact on ROI and the break-even date from which your investment starts to pay off. The tool helps you evaluate how much time and money you could save by automating repetitive tasks. Give it a try: automating will be worth it, believe me.
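For readers who prefer a formula over a gut feeling, the break-even calculation behind such a dashboard can be sketched in a few lines of Python. All the numbers here are illustrative assumptions, not benchmarks:

```python
# Hypothetical inputs: adjust these to your own situation.
automation_cost = 12_000   # one-off cost to build the automation (CHF)
task_minutes = 20          # manual effort per task execution
runs_per_month = 200       # how often the task is performed each month
hourly_rate = 90           # fully loaded cost per employee hour (CHF)

# Money saved per month once the task no longer needs manual work.
monthly_saving = task_minutes / 60 * runs_per_month * hourly_rate

# Months until the cumulative savings cover the one-off investment.
breakeven_months = automation_cost / monthly_saving

print(f"Monthly saving: CHF {monthly_saving:.0f}")
print(f"Break-even after {breakeven_months:.1f} months")
```

From the break-even point onward, every further month of savings is pure return; a refined model would also subtract any ongoing maintenance cost of the automation from the monthly saving.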
Managing the pipeline from raw data ingestion through storage on a data platform to data transformation plays a crucial role in every data-driven company. This pipeline, also called Extract, Load & Transform (ELT), can be largely outsourced by using a modern data stack. This is where Fivetran, Snowflake, and dbt enter the stage.
Two unicorns and a rising star make our daily lives in analytics exciting, fast, and more reliable. Snowflake is a fantastic data warehouse service, hosted on either AWS or Azure, that charges for computing and storage only: no usage, no bills. As with Fivetran, pricing is based on credits that depend on the tier and the location of the cloud your Snowflake instance is running on. We are currently automating the maintenance of our Snowflake data warehouse with Terraform; exciting times, I would say.
To feed the Snowflake data warehouse we use Fivetran, whose first official partner in Switzerland we are. Amazing people in Munich, to say the least; we look forward to visiting them as soon as these not-so-fun times are over. Fivetran is essentially a data pipeline service that takes over the heavy lifting for us: experts say that ingestion and maintenance can take up about 80% of the work on an analytics platform before any benefits in the form of actual analytics can be presented to the customer. If you are interested in their offering, let us know and we can support you on that journey.
dbt (data build tool) takes over the last part of ELT (Extract, Load & Transform): the transformation step. This application is a gem, a rising star in the analytics community, and it is getting a lot of attention from Fivetran. Best of all, it is open source and available for anyone to use. The community is highly active and very helpful on your modelling journey. On our own journey, we aim to build a product based on these and other tools to provide a comprehensive analytics suite to our customers.
We are currently working in proof-of-concept mode together with innovative companies that implement future-proof technology while applying fundamentally robust patterns to build upon. Very exciting times lie ahead, also for the Swiss market: since 2020, Azure has operated data centres in Switzerland, which removes the concerns that were holding certain industries back from moving towards cloud solutions. Do you have experience with dbt or any kind of dbt offering in Switzerland?
In May 2019 I started an assignment at an international institution in Basel. They were hiring Tableau developers, and I wanted to steer my career more towards business intelligence, since I think people should spend their time interpreting data with their domain knowledge, not struggling to pull high-quality data sets together. In the interviews I was warned that there are certain limitations when working with cubes in Tableau.
And my dear, yes there are. You cannot create calculated fields with dimensions. It is hard to browse through the underlying data in the cube, since Tableau can only show field names and other attributes of those fields, but not their content. That is why we fall back on Excel or SSMS (SQL Server Management Studio).
What changes with a cube, then? Although you are limited with calculated fields, you can compensate with a feature called a calculated member. This MDX-driven counterpart to the calculated field is more powerful and can also work with dimensions, allowing you to group dimensions or rename them and give them an alias, which ordinary calculated fields cannot do in a cube environment.
But there is hope: the experience will not be great, but there are some tips and tricks I will share in this blog. More to come soon.