Measuring data from generative chatbots
https://insightland.org/blog/measuring-data-from-generative-chatbots/ Thu, 10 Apr 2025 10:52:00 +0000

Generative chatbots like Perplexity and ChatGPT successfully display links to specific products, categories, or forum articles – either in their responses or as sources. The need to appear in the answer provided by a chatbot, to obtain additional organic traffic, is the first step.

The post Measuring data from generative chatbots appeared first on insightland.


After all, their popularity is growing. At the end of 2024, ChatGPT alone was receiving 3.8 billion visits per month. And according to Gartner research, by the end of 2028 as many as 28% more users than today will choose generative AI platforms over traditional web search as the first step of their online research.

Making an effort to be present in the AI chat responses is crucial. But the second and most important step is measuring the effectiveness of the activities you’ve performed to get there. This is significant because it allows you to assess ROAS and the validity of investing in this new traffic channel. 

The problem is that neither Google Search Console nor Bing Webmaster Tools (we’ll come back to this later) provides accurate data on the queries appearing in chats – the kind of query-level data we are used to from classic SEO.

Therefore, it is necessary to develop methods that will allow us to measure how often, for which phrases, and with what impact on business a given brand is displayed in ChatGPT, Perplexity or Copilot.

Here’s more about some of the available measuring tools and methods, and how to combine the data we get from them to obtain some actionable insights.

Bing Webmaster Tools

Bing tells us that the data shown in Search Performance does not apply to generative chats. 

At the same time, the “Internet and chat” item is visible in the data filters:

Our research indicates, however, that data from chats, although extremely anonymized, can be displayed in Bing Webmaster Tools.

Here’s an example of an exact query tested by us for one of our clients, and the results we got:

The number of views isn’t impressive, but it’s there. In our tests, though, the link to the client’s website was displayed many more times, and we actually clicked on the link ourselves, which was not captured by the tool at all. Nevertheless, this indicates that some chat data does appear in Bing Webmaster Tools, but in such a limited form that it is unsuitable for efficiency analysis.

Google Analytics 4 (GA4)

Google Analytics 4 (GA4) and Looker Studio (formerly Google Data Studio) come to the rescue, allowing for deep analysis and optimization of activities in the area of AI Search Optimization.

We can measure the effectiveness of a given brand’s appearances in generative chatbots thanks to the specific source/medium values attached to the URLs mentioned during chat conversations. Even if the data may not be 100% accurate due to cookie consent, it still gives us an indication of, for example, a growing share of chatbot referrals in the website’s organic traffic.

Example from ChatGPT:

ChatGPT and Perplexity, among others, send data about visits via the referral attributes mentioned above.

*Bing is an exception in this regard (more about this in the section on log analysis below). 

Free Looker Studio report by Insightland

At Insightland, we work with data every day. Measuring the effectiveness of our activities is crucial for us. That’s why we have prepared a free Looker Studio report with data from AI Search. 

Here’s a sample view from the report:

What do you get from chatbot and AI Search data analysis in GA4?

01 Understanding traffic sources

GA4 analysis allows you to identify which AI Search platforms (ChatGPT, Perplexity, Gemini, etc.) are driving traffic to your website. According to research by Ahrefs, the most AI traffic comes from ChatGPT, Perplexity, and Gemini. Awareness of these proportions allows you to adapt your optimization strategy to the most popular sources.

02 Assessment of impact on sessions

Monitoring sessions in GA4 allows you to assess how traffic from AI Search affects user engagement, time spent on the site and bounce rate. This, in turn, allows you to assess the quality of traffic and its potential impact on conversions.

03 Measuring the impact on revenue

Integrating data from GA4 with revenue data allows you to determine the extent to which traffic from AI Search translates into real profits. This is crucial for assessing the ROI in AI Search optimization.

How to use Looker Studio and GA4 reports to measure chatbot data?

Download the free report

We have prepared a free Looker Studio report template that helps you analyze the impact of generative chatbots on your business.

Analyze traffic sources

The report allows you to identify traffic sources, including AI Search platforms, that generate the most sessions on your website.

Measure the impact on revenue

Integrate revenue data to assess the extent to which traffic from AI Search translates into real profits for your business.

Track changes over time

Analyze changes in key metrics over time to assess the effectiveness of your optimization efforts and identify trends.

Sample metrics to track

  • AI Search sessions: Number of sessions that originate from AI Search platforms (ChatGPT, Perplexity, Gemini, etc.).
  • Conversion rate with AI Search: The percentage of AI Search sessions that resulted in a conversion (e.g. purchase, registration, form completion).
  • Revenue from AI Search: Revenue generated by sessions that originate from AI Search platforms.
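
As a sketch of how these three metrics might be computed, here is a minimal pure-Python example over illustrative session records; the field names (`source`, `converted`, `revenue`) and the source list are assumptions, not GA4’s actual export schema:

```python
# Illustrative session records -- field names and values are assumed.
sessions = [
    {"source": "chatgpt.com",   "converted": True,  "revenue": 120.0},
    {"source": "chatgpt.com",   "converted": False, "revenue": 0.0},
    {"source": "perplexity.ai", "converted": True,  "revenue": 80.0},
    {"source": "google",        "converted": True,  "revenue": 200.0},
]

AI_SOURCES = {"chatgpt.com", "perplexity.ai", "gemini.google.com"}

ai = [s for s in sessions if s["source"] in AI_SOURCES]
ai_sessions = len(ai)                               # AI Search sessions
conv_rate = sum(s["converted"] for s in ai) / ai_sessions  # conversion rate
ai_revenue = sum(s["revenue"] for s in ai)          # revenue from AI Search

print(ai_sessions, round(conv_rate, 2), ai_revenue)  # 3 0.67 200.0
```

In practice you would run the equivalent aggregation over the GA4 BigQuery export or a Looker Studio data source rather than in-memory records.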

In the era of growing popularity of AI chatbots, monitoring and analyzing the traffic they generate becomes crucial. Standard tools such as Google Analytics 4 (GA4) may not provide the full picture, though. Therefore, it is worth using advanced methods, such as server log analysis.

Server log analysis: the key to the whole picture

The available data platforms currently don’t show us the full spectrum of data about traffic coming from AI search. Analysing the server logs may help us discover some interesting details necessary to start a valuable AI search traffic analysis.

What are server logs?

Server logs are records of all activities and events that occur on the server. They contain information about IP addresses, referrer, visit time, HTTP code and User-Agent. Thanks to them, we can thoroughly monitor traffic, diagnose problems, and increase website security.

How to use server logs to analyze traffic from ChatGPT?

Identifying OpenAI bots: OpenAI uses several crawlers with distinct functions that identify themselves through User-Agent. In the context of measuring visibility in ChatGPT, it is crucial to analyze logs for the presence of the ChatGPT-User bot.

ChatGPT-User: A request from this bot in the server logs for a specific URL means that a link to that URL was displayed as a source in a ChatGPT response.

Calculating the CTR rate: Log analysis allows you to calculate the approximate CTR (Click-Through Rate) for links displayed in ChatGPT.
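
A minimal sketch of both steps, assuming combined-format access logs and that the bot identifies itself with the literal token “ChatGPT-User” (verify the exact user-agent string in your own logs); the GA4 click counts are hypothetical placeholders:

```python
import re

# Example access-log lines in combined format (illustrative data).
LOG_LINES = [
    '1.2.3.4 - - [10/Apr/2025:10:00:00 +0000] "GET /blog/post-a HTTP/1.1" '
    '200 5123 "-" "Mozilla/5.0; compatible; ChatGPT-User/1.0; +https://openai.com/bot"',
    '5.6.7.8 - - [10/Apr/2025:10:05:00 +0000] "GET /blog/post-a HTTP/1.1" '
    '200 5123 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

def count_chatgpt_views(lines):
    """Count requests per URL made by the ChatGPT-User bot ("views")."""
    views = {}
    for line in lines:
        if "ChatGPT-User" not in line:
            continue
        m = re.search(r'"GET (\S+) HTTP', line)
        if m:
            views[m.group(1)] = views.get(m.group(1), 0) + 1
    return views

views = count_chatgpt_views(LOG_LINES)
# Hypothetical GA4 sessions attributed to chatgpt.com, per landing page:
ga4_clicks = {"/blog/post-a": 0}
ctr = {url: ga4_clicks.get(url, 0) / n for url, n in views.items()}
print(views, ctr)
```

The resulting ratio is only an approximation of CTR, since one server request can correspond to several source mentions in a single response (see the comments further below).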

OpenAI bots and their User-Agent

OAI-SearchBot: Crawls pages to build SearchGPT search results.

GPTBot: Searches content to train AI models. Blocking GPTBot in the robots.txt file prevents the website content from being used for AI training.

ChatGPT-User: Visits pages on the user’s behalf so that ChatGPT can answer a question and include a link to the source.
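
If the goal is to keep content out of AI model training while staying visible in chatbot answers, the distinction between these bots matters in robots.txt. A sketch (confirm the current bot names against OpenAI’s crawler documentation before deploying):

```
# Block model-training crawls, keep search/answer visibility
User-agent: GPTBot
Disallow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ChatGPT-User
Allow: /
```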

Example

The Samoseo.pl article (in Polish) presents a case study of server log analysis for various websites:

  • Site A: No presence of OpenAI bots in the logs. After the ChatGPT query test, ChatGPT-User performed 1 GET query to the home page of the site that was referenced in the source.
  • Site B: 74 ChatGPT-User bot queries over 3 months. The proportion of views on ChatGPT to views on Google was 47 to 34,293 (approx. 729 times Google’s advantage).
  • Site C: 15,342 ChatGPT-User bot queries to 466 unique URLs over 3 months.

Comments:

  • One URL may be specified as a source several times in one ChatGPT response, but this only corresponds to one request to the server.
  • Log analysis will not indicate whether the link to the website appeared directly in the response or only after clicking on “Sources”.

The logs may show what GA4 doesn’t

For instance, GA4 may not show the correct scale of traffic from chatbots, e.g. due to Consent Mode. Additionally, GA4 only shows “visits” to the site (sessions and users), not “views”, i.e. moments when a link to the site appears in the ChatGPT response. That’s why using server logs could be a game changer. 

Another example, described in the Ahrefs study, shows that traffic from Copilot is recognized as Direct.

From our own experience, our clients’ reports do not show Bing as a source beyond standard organic/paid traffic or referral traffic from cn.bing.com. Below you’ll find data from the last 28 days for one of our clients:

Conclusion

As AI chatbots continue to gain traction, the traffic they generate is becoming an increasingly valuable source to understand and leverage. While current measurement tools may fall short in precision, a thoughtful combination of data from the currently available tools can still provide useful insights.

For those ready to go deeper, more advanced techniques like server log analysis offer a clearer view – but they do require a higher level of technical expertise.

If you’re looking to take that first step toward understanding and optimizing your AI chatbot-driven traffic, at Insightland we can help bridge the gap and get you started with the AI search analysis.


Data-Driven Decision-Making
https://insightland.org/blog/data-driven-decision-making/ Fri, 15 Nov 2024 11:25:37 +0000

In today’s fast-paced business environment, the value of data can’t be overstated. Every successful decision starts with insights and numbers—not assumptions.

The post Data-Driven Decision-Making appeared first on insightland.


Introduction

McKinsey reports that data-driven companies are:

  • 23 times more likely to attract new customers,
  • 6 times more likely to retain them, and
  • 19 times more likely to boost profitability.

The advantages are clear, but what does it mean to be genuinely data-driven?

What is Data-Driven Decision-Making (DDDM)?

Data-driven decision-making may sound like a buzzword, but at its core, it’s simple: it’s about making business decisions rooted in concrete insights, not guesses or gut feelings. When companies harness data to shape their strategy, they gain clarity, direction, and a competitive edge. With solid market research and actionable insights, organizations can make confident choices in marketing, product development, positioning, and pricing.

How Does DDDM Benefit Businesses?

Data-driven decision-making is an approach that brings data to the center of each choice, enabling companies to:

  • Optimize operations through efficient resource allocation,
  • Enhance customer experience by predicting needs and personalizing interactions, and
  • Increase visibility by making targeted, insight-backed marketing choices.

Real Examples of Data-Driven Wins!

Summary

Data holds endless possibilities, but only when it’s guided by insight. At Insightland, we help organizations turn numbers into clarity and strategy. Ready to discover where your data can lead you?



New Features in BigQuery
https://insightland.org/blog/new-features-in-bigquery/ Wed, 11 Sep 2024 14:00:03 +0000

The post New Features in BigQuery appeared first on insightland.


Introduction

Google BigQuery has introduced new features that enhance data analysis, real-time processing, and integration with other tools. These improvements facilitate advanced analytics and process automation, enabling businesses to make quicker decisions.


New Features

  • ScaNN (Scalable Nearest Neighbors): Enables fast searching of similar data in large datasets, useful in ML applications like recommendations and image analysis. It allows BigQuery to handle large volumes of queries efficiently.
  • Continuous Queries: Supports real-time data processing, providing instant access to the latest information. This helps detect fraud quickly and supports reverse ETL, which sends data to other systems like Apache Kafka without batch processing.
  • Data Insights: Offers interactive exploration, automatic summaries, and visualizations of data in BigQuery. Users can easily understand data structure and quality, identify trends, and gain insights without needing advanced technical skills.

Summary

With these new BigQuery features, companies can process data faster and make decisions based on up-to-date information, speeding up operational activities.


New Features in Google Analytics 4 (GA4)
https://insightland.org/blog/new-features-in-google-analytics-4-ga4/ Wed, 11 Sep 2024 13:51:29 +0000

The post New Features in Google Analytics 4 (GA4) appeared first on insightland.

“It Used to Be… and Now It’s Back!”

Introduction

GA4 brings back features we loved from GA3, offering a more comfortable and time-saving experience. These are straightforward but handy solutions that make using GA easier and help quickly find what you need.


Benchmarking: Lets you compare your business performance with industry data, conveniently showing how you stack up against competitors without needing external tools.

Plot Rows: Allows you to visualize several rows of data in reports, making quick comparisons possible with just a few clicks.

Improved Reports and Anomaly Detection: Offers more detailed session and transaction reports and automatically detects unusual patterns, helping to spot issues faster without manual data searching.

Key Event Marking: The option to easily mark key events is back, making it simpler to identify important metrics without unnecessary clicks and settings.
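
The anomaly detection mentioned above is done automatically by GA4 using its own statistical models. As a simplified illustration of the underlying idea only (not GA4’s actual algorithm), a basic z-score check over a daily metric looks like this:

```python
from statistics import mean, stdev

def is_anomaly(history, latest, threshold=3.0):
    """Flag `latest` if it deviates from the history by > threshold sigmas."""
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(latest - mu) / sigma > threshold

daily_sessions = [1020, 980, 1010, 995, 1005, 990, 1000]
print(is_anomaly(daily_sessions, 1003))  # False: within the normal range
print(is_anomaly(daily_sessions, 1500))  # True: likely an anomaly
```

GA4 saves you from maintaining checks like this yourself, but the sketch shows what “unusual pattern” means in practice: a value far outside the metric’s historical variation.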


Microsoft Power BI. The key to effective data analysis
https://insightland.org/blog/microsoft-power-bi-the-key-to-effective-data-analysis/ Thu, 15 Jun 2023 06:21:12 +0000

The post Microsoft Power BI. The key to effective data analysis appeared first on insightland.

One of the key factors determining a company’s competitive advantage is the capacity to analyze data quickly – especially the ability to identify key trends, opportunities, and threats in the company’s micro-environment.

Identifying these opportunities and threats allows for agile action: shifting advertising budgets, changing inventory, or launching activities that take advantage of observed market trends to build additional revenue.

Intro

This article provides an overview of the key features of Microsoft Power BI, focusing on how they can build business value.

What is Power BI?

Power BI is an all-encompassing analytical solution crafted by Microsoft to facilitate the streamlined processing, analysis, and presentation of business data. This robust tool empowers organizations to make informed decisions and optimize their operational workflows. 

It is crucial to emphasize that these decisions are rooted in real insights derived from the data, rather than relying on subjective intuition. Therefore, integrating Power BI into an organization’s framework can be instrumental in fostering a data-driven culture.

Power BI is a component of Microsoft’s latest product, Microsoft Fabric.

Integrate different data sources

Power BI possesses remarkable capabilities in harmonizing data from diverse sources, encompassing ERP systems, financial and accounting solutions, Excel spreadsheets, email marketing platforms, social media networks, and real-time streaming data, as well as services hosted on reputable cloud providers such as Microsoft Azure, AWS, and Google Cloud.

One of its key strengths lies in its ability to consolidate and unify these disparate data streams, leveraging common denominators, such as time, to construct comprehensive analyses. It enables businesses to gain insights from their sales systems, including online store databases, while concurrently incorporating data on advertising expenditures from platforms like Facebook Ads and Google Ads, alongside production and warehousing costs. The resulting consolidated reports provide a comprehensive overview, facilitating informed decision-making and strategic planning for organizations.

Variety of data formats

Power BI offers support for diverse data sources and caters to various data types in a technical context. It means that we can seamlessly connect to data stored in SQL and NoSQL databases as well as various file formats, and leverage available services through APIs.

Moreover, Power BI provides the flexibility to utilize scripting connectors built on the R language or Python, empowering users to establish customized connections to their preferred data sources. This functionality opens up endless possibilities for integrating and analyzing data, further enhancing the adaptability and versatility of the Power BI platform.

Clear and intuitive report panel

Establishing a robust data connection unlocks the potential of leveraging Power BI’s comprehensive and user-friendly reporting interface.

Within this interface, we can construct interactive reports that offer dynamic filtering across various dimensions, such as time and product-related characteristics. It includes factors like the product category, customer type, series, issue number, traffic source, or advertising campaign ID.

Regarding data presentation, we are not limited to traditional bar or line charts. Power BI provides a rich selection of visualization options, including waterfall charts that display positive and negative changes in our budget’s final value, funnel charts that illustrate the progression of the purchase process on our website, treemaps that accurately depict the sales proportions across different product categories and map-based visualizations that showcase sales volumes per store within a specific geographical area.

A key advantage is displaying sales results from multiple markets or product categories in a single view, utilizing diverse visualization types. It facilitates swift comparative analysis and enables the identification of potential areas of concern, thus empowering businesses to address issues proactively.

The knowledge available in real time

Through adept data source construction and report design, Power BI enables access to real-time data, presenting exceptional opportunities for timely decision-making based on insights derived from data just minutes old rather than relying on outdated information from the previous day.

Consider scenarios like an intensive advertising campaign, a Black Friday sale, or a prime-time TV commercial. These events create an immediate impact, eliciting specific purchasing behaviors within our target audience. By promptly observing and analyzing these behaviors, we can optimize revenue generation through swift adjustments to our offerings or by implementing additional purchase incentives. For instance, we can leverage time-sensitive promotions limited to the next two hours or effectively utilize relevant social proof by highlighting the increasing number of users who have already purchased the product.

By capitalizing on real-time data availability, businesses can capitalize on fleeting opportunities and make data-driven decisions that maximize their revenue potential.

Advanced data analysis features

Power BI also offers many advanced analytical features, such as the built-in DAX programming language, which allows the creation of complex calculations and data analysis, such as sales prediction, RFM analysis, or segment and cohort analysis.
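
In Power BI, an RFM analysis like the one mentioned above would typically be written in DAX; purely to illustrate the logic, here is a minimal sketch in Python over made-up order data (customer names, dates, and amounts are all hypothetical):

```python
from datetime import date

# Hypothetical order history: customer -> list of (order_date, amount).
orders = {
    "alice": [(date(2023, 5, 1), 120.0), (date(2023, 6, 10), 80.0)],
    "bob":   [(date(2022, 11, 3), 40.0)],
}
today = date(2023, 6, 15)

def rfm(customer_orders):
    """Return (recency in days, frequency, monetary value) for one customer."""
    recency = (today - max(d for d, _ in customer_orders)).days
    frequency = len(customer_orders)
    monetary = sum(a for _, a in customer_orders)
    return recency, frequency, monetary

scores = {customer: rfm(o) for customer, o in orders.items()}
print(scores["alice"])  # (5, 2, 200.0)
```

The same three measures, expressed in DAX over a fact table of orders, are what feed segment and cohort reports in Power BI.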

In addition, within Power BI, we can use Data Science libraries available in Python and present the result obtained from them in the Power BI interface by using the built-in “Python Visual” control.

Confidentiality and data security

Power BI allows you to apply confidentiality labels using Microsoft Information Protection, which helps you better protect your organization’s data. Power BI’s security is further strengthened by integrating Azure services, such as Azure Private Link and Azure Virtual Network, to secure data in the cloud.

Integration with Excel

Any Microsoft 365 platform user can easily link Excel queries, data models, and reports to Power BI dashboards to accelerate the collection, analysis, publishing, and sharing of Excel business data within an organization.

Summary

Power BI is a formidable analytical tool that empowers users to conduct efficient data analysis, generate insightful reports, and present results in a clear and accessible manner. This capability equips organizations to make informed and improved business decisions, optimize operational processes, and effectively monitor prevailing market trends. By harnessing the capabilities of Power BI, businesses gain a competitive edge in their ability to leverage data-driven insights for enhanced performance and strategic advantage.


Don’t sleep on the change: Migrating from GA3 to GA4
https://insightland.org/blog/dont-sleep-on-the-change-migrating-from-ga3-to-ga4/ Thu, 09 Mar 2023 05:52:32 +0000

The post Don’t sleep on the change: Migrating from GA3 to GA4 appeared first on insightland.

As of this writing, there are approximately 120 days left until Universal Analytics (GA3) stops collecting new data. The July 1, 2023 deadline to migrate to the newer Google Analytics 4 (GA4) is fast approaching, leaving website owners and digital marketers with little time to migrate their data. With the recent release of Google Analytics 4, it’s imperative that professionals migrate from the previous version, Universal Analytics (Google Analytics 3), to take advantage of the new and improved features. However, many professionals may not yet be considering a migration.

It’s important to understand that Google Analytics 4 is not just an upgrade from GA3; it’s an entirely new platform with an entirely new data model and user interface. This means that the migration process requires you to learn how to use the new platform and adjust your reporting and analysis strategies accordingly.

That being said, it’s critical to approach the migration process with a strategic and thoughtful mindset. Let’s explore the key factors to consider when migrating from GA3 to GA4.

Revising business goals when migrating to Google Analytics 4

Developing a comprehensive migration plan that includes a timeline with realistic deadlines and measurable milestones is critical to staying organized and on track throughout the transition. By breaking the process down into manageable steps, such as data tracking implementation, data validation, and data analysis, you can ensure that the migration is completed efficiently and effectively.

As you prepare to migrate from Google Analytics 3 to Google Analytics 4, it’s a prime opportunity to conduct a comprehensive revision of your KPIs and business goals. By taking the time to map out your organization’s needs and priorities, you can assess whether your current tracking is effectively capturing the data you need.

By reviewing your existing KPIs and business goals, you can ensure that they are aligned with your overall strategy and objectives. This can also help you identify any gaps or areas where data is missing or incomplete, providing insight into potential opportunities for improvement.

Understanding GA3 vs GA4 event tracking for a successful migration

When migrating from Google Analytics 3 to Google Analytics 4, it’s important to understand the differences between the event tracking capabilities of the two platforms.

In GA3, event tracking was accomplished using a category, action and label structure. Categories were used to group related events, actions described the type of interaction, and labels provided additional context. GA4, on the other hand, uses a parameter-based event tracking system.

In GA4, events are created by sending a combination of predefined and custom parameters to Google Analytics. These parameters provide information about the event, such as the action that occurred, the category of the event, and any relevant metadata.

To translate your GA3 event tracking to GA4, you’ll need to identify the key parameters that correspond to the Category, Action, and Label fields in GA3. You’ll also need to ensure that any custom dimensions or metrics previously used in GA3 map correctly to the corresponding parameters in GA4.

Event-based analysis in GA4 provides more freedom and flexibility in defining and tracking events. However, this approach requires a strategic and holistic perspective, as custom event tracking must be aligned with overall business goals and objectives.

To effectively use event-based analytics in GA4, it is necessary to identify the key actions that users take on a website. This requires a deep understanding of the site’s user flow and how users interact with different elements. The process of event tracking requires a strategic approach that considers the overall goals of the website and the business strategy.

By taking a holistic approach to event-based analytics, organizations can leverage the full power of GA4 to gain insight into user behavior. This data-driven approach to decision-making can improve overall site performance and optimize the user experience.

The Importance of Exporting Historical Data

Exporting historical data from Google Analytics 3 can be a valuable step in the migration process. It allows you to maintain access to a historical view of your website’s performance even after migrating to GA4. This can be especially important for organizations that rely heavily on historical data for trend analysis, forecasting, or other data-driven decision-making processes.

It’s important to note that data from Universal Analytics (GA3) will not be migrated to GA4. While historical reports will remain accessible on the Google Analytics 3 dashboard for six months after the end date (July 1, 2023), you will permanently lose access to your data after that point.

Reporting considerations when migrating to GA4

When migrating from GA3 to GA4, reporting is a critical component that requires careful consideration. The differences between the two platforms in terms of metrics and approach to analysis require a corresponding shift in reporting strategy. It’s important to evaluate your current reporting framework and determine how it needs to evolve to fit the new data model.

One of the biggest challenges in reporting migration is the source of the data used to create reports. GA4 has tighter API restrictions, making it more difficult to retrieve data through Looker Studio. The best solution is to use BigQuery as it provides a reliable and scalable way to store and access data.

BigQuery is a cloud-based data warehouse solution that is fully integrated with Google Analytics 4. One of the main benefits of using BigQuery with GA4 is its ability to quickly and efficiently handle massive amounts of data, providing deeper insights into your website and user behavior.

It’s also important to note the impact of the new GA4 data model on reporting. GA4’s event-based tracking differs from GA3’s session-based tracking, requiring a fresh approach to reporting. Therefore, it’s vital to re-evaluate your KPIs and reporting templates to align them with the new data model. This will ensure that your reporting is accurate and relevant, providing you with valuable insight into user behavior and site performance.

Summary

In summary, while the transition from Google Analytics 3 to Google Analytics 4 may seem daunting, a methodical approach can make the process more manageable. By creating a checklist of items to migrate, developing a migration plan with milestones, conducting a thorough audit, and revising reporting and KPIs to align with the new data model, the migration can be completed successfully.


Integration of Google Analytics 4 and BigQuery. How much does it cost? Calculator
https://insightland.org/blog/integration-of-google-analytics-4-and-bigquery-how-much-does-it-cost-calculator/ Tue, 21 Feb 2023 10:35:39 +0000

The post Integration of Google Analytics 4 and BigQuery. How much does it cost? Calculator appeared first on insightland.

]]>
BigQuery in the Google Analytics ecosystem

BigQuery is a database service provided by Google, one of its Google Cloud Platform products.

It is a serverless, scalable cloud data warehouse service, designed to handle large data sets along with a large number of SQL queries.

Characteristic features of BigQuery include the use of:

  • BI Engine – a built-in, powerful in-memory analysis engine for handling large volumes of queries in real time
  • BigQuery ML – a machine learning feature for building models using SQL
  • BigQuery Omni – enables queries against other databases from within the BigQuery environment
  • Data QnA – enables asking free-form text questions of data sets
  • Connected Sheets – enables connecting a BigQuery data set to a copy of the data in Google Sheets
  • Geospatial data – enables the use of data for GEO-type services


Along with Google Analytics 4, a native integration between GA4 and BigQuery has been introduced. As a result, all data captured by Google Analytics is streamed to the BigQuery database (of course, if we complete the appropriate integration => here you can read how to do this).
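Once the export is live, the raw events land in date-sharded tables named events_YYYYMMDD in your export dataset. As a minimal sketch (the project and dataset names below are placeholders, not real ones), a daily event count over those tables could be queried like this:

```python
def daily_event_counts_sql(project: str, dataset: str, start: str, end: str) -> str:
    """Build a query over the date-sharded GA4 export tables (events_YYYYMMDD).

    project and dataset are placeholders for your own export destination;
    start/end are dates in YYYYMMDD form, matching the table suffixes.
    """
    return f"""
    SELECT event_date, event_name, COUNT(*) AS events
    FROM `{project}.{dataset}.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
    GROUP BY event_date, event_name
    ORDER BY event_date, events DESC
    """

sql = daily_event_counts_sql("my-project", "analytics_123456", "20230101", "20230131")
```

Running it would go through the BigQuery console or client library; the string above only shows the shape of a query against the export schema.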

At the same time, Google has restricted access to Google Analytics 4 via API. Thus, connection to Looker Studio via the GA4 connector will also be limited because of those restrictions.

This means that all Looker Studio reports should be switched to the BigQuery connector as the main report feed. Otherwise, you can expect some difficulties in working with Looker Studio reports, especially if they contain a significant number of charts.

For more information on the GA4 API limits, go to the following addresses:

BigQuery. How much does it cost? Billing rules

As part of the Google Cloud Platform, BigQuery is also a paid service.

BigQuery pricing has two main components – as illustrated in the figure below: 

BigQuery

Storage: data storage pricing

The pricing for the storage space used varies slightly depending on the region/zone where we want to keep our data, e.g.:

  • When selecting the Europe (eu) zone, the cost of active local storage is => $0.02 per GB
  • When selecting the Zürich (europe-west6) zone, the cost of active local storage is => $0.025 per GB
  • When selecting the Madrid (europe-southwest1) zone, the cost of active local storage is => $0.029 per GB
  • When selecting the Frankfurt (europe-west3) zone, the cost of active local storage is => $0.023 per GB

Full price list for storage space used for the Frankfurt (europe-west3) zone:


The distinction between Active storage and Long-term storage is also important:

  • Active storage => Includes any table or table partition that has been modified in the last 90 days
  • Long-term storage => Includes any table or table partition that has not been modified for 90 consecutive days.

What is important – there is no difference in performance or availability between active and long-term storage.

Illustrative calculation of storage cost

So let’s try to do a BigQuery cost calculation (cost per month) for a few selected storage capacities (for the Europe – EU zone):

  • for 100 MB for half a month, you pay $0.001  
  • for 500 GB for half a month, you pay $5
  • for 1 TB for a full month, you pay $20
  • for 10 TB for a full month, you pay $200
  • for 100 TB for a full month, you pay $2000
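The arithmetic behind those bullets is simply gigabytes × rate × fraction of a month. As a small sketch, assuming the Europe (eu) active-storage rate of $0.02 per GB per month:

```python
def storage_cost_usd(gb: float, months: float = 1.0,
                     rate_per_gb_month: float = 0.02) -> float:
    """Active-storage cost: GB stored x monthly rate x fraction of a month.

    The default rate is the Europe (eu) active-storage price; pass a
    different rate_per_gb_month for other zones (e.g. 0.023 for Frankfurt).
    """
    return gb * rate_per_gb_month * months

# Reproduce the bullet examples above:
print(storage_cost_usd(0.1, months=0.5))   # 100 MB for half a month
print(storage_cost_usd(500, months=0.5))   # 500 GB for half a month
print(storage_cost_usd(1000))              # 1 TB for a full month
```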

Case study 1

Google Analytics on the website generates 15,000,000 events per month. Work out an estimate of BigQuery’s storage cost for the coming year.

An example of a storage cost calculation for a website generating 15,000,000 events per month:

Month       Storage (GB)    Price (USD)
Month 1          25.00           0.57
Month 2          50.00           1.15
Month 3          75.00           1.72
Month 4         100.00           2.30
Month 5         125.00           2.88
Month 6         150.00           3.45
Month 7         175.00           4.02
Month 8         200.00           4.60
Month 9         225.00           5.17
Month 10        250.00           5.75
Month 11        275.00           6.32
Month 12        300.00           6.90

So the full-year cost of space (storage) will be about $44.85.

Case study 2

Google Analytics on the website generates 150,000,000 events per month. Work out an estimate of BigQuery’s storage cost for the coming year.

An example calculation for a website generating 150,000,000 events per month:

Month       Storage (GB)    Price (USD)
Month 1         250.00           5.75
Month 2         500.00          11.50
Month 3         750.00          17.25
Month 4        1000.00          23.00
Month 5        1250.00          28.75
Month 6        1500.00          34.50
Month 7        1750.00          40.25
Month 8        2000.00          46.00
Month 9        2250.00          51.75
Month 10       2500.00          57.50
Month 11       2750.00          63.25
Month 12       3000.00          69.00

So the full-year cost for storage will be about $448.50.
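Both case studies follow the same pattern: storage grows by a fixed amount each month, so the yearly cost is the monthly increment times the rate times the sum 1 + 2 + … + 12 = 78. A sketch of that calculation, using the $0.023/GB rate the tables above appear to be based on:

```python
def yearly_storage_cost_usd(monthly_gb_growth: float,
                            rate_per_gb_month: float = 0.023) -> float:
    """Cumulative 12-month storage cost when storage grows linearly each month."""
    return sum(month * monthly_gb_growth * rate_per_gb_month
               for month in range(1, 13))

print(round(yearly_storage_cost_usd(25), 2))    # case study 1: ~44.85
print(round(yearly_storage_cost_usd(250), 2))   # case study 2: ~448.50
```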


Capacity (storage) cost calculator

Enter the monthly number of requests in Google Analytics:

Where can I find the monthly number of requests?

  • Universal Analytics: Admin => Property Settings
  • Google Analytics 4: Reports => Engagement => Events 

Analysis Pricing: price for running queries/operations to the database

Price for analytical queries run => what does it mean?

Analysis pricing is the cost to process queries, including SQL queries, user-defined functions, scripts, and certain data manipulation language (DML) and data definition language (DDL) statements that scan tables.

So it’s a cost that is in direct proportion to our analytic activities – the more we query a dataset, the higher this cost will be.

Google Cloud Platform offers a choice of two alternative pricing approaches:

  • Model 1 => On-demand pricing (variable cost, calculated per query)
  • Model 2 => Flat-rate pricing (fixed cost, i.e. a flat rate – the price for reserving a resource)

On-demand pricing => Price for queries run

With this pricing model, you are charged for the number of bytes processed by each query. The first 1 TB of query data processed per month is free.

The rate may vary depending on the region / zone.

  • The cost of 1 TB is $5 for the Europe EU region
  • The cost of 1 TB is $6.50 for the Frankfurt (europe-west3) region
  • The cost of 1 TB is $7 for the Zürich (europe-west6) region

It is important to emphasize that this cost is paid only for data queries. So, if you don't do any work (queries) with the data, there is no cost. What's more, before each database query you can check its estimated cost, and then cancel (or modify) it if the cost is too high.
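That pre-flight cost check is done with a "dry run": BigQuery reports how many bytes a query would scan without actually executing it. A sketch of how this could look with the google-cloud-bigquery Python client (the client call is shown as a commented-out usage example because it requires credentials; the cost conversion itself is plain arithmetic):

```python
def dry_run_cost_usd(total_bytes_processed: int, price_per_tb: float = 6.50) -> float:
    """Convert the bytes a dry run reports into an on-demand cost estimate.

    price_per_tb defaults to the Frankfurt (europe-west3) rate;
    adjust it for your region.
    """
    return total_bytes_processed / 1e12 * price_per_tb

# Usage sketch (requires google-cloud-bigquery and credentials):
# from google.cloud import bigquery
# client = bigquery.Client()
# cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
# job = client.query("SELECT ...", job_config=cfg)  # scans nothing, costs nothing
# print(dry_run_cost_usd(job.total_bytes_processed))

print(dry_run_cost_usd(2 * 10**12))  # a query scanning 2 TB -> 13.0
```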

Excerpt from Google BigQuery documentation for the Frankfurt (europe-west3) region:

An illustrative cost calculation according to the calculator found on Google’s website:

https://cloud.google.com/products/calculator#id=023b45c8-46d2-468d-b580-df2dc334a237

Calculation for the Frankfurt (europe-west3) region:

If we run queries that download a total of 2 TB of data, our cost per month will be:

  • $6.50 per month (we effectively pay for 1 TB, because the first 1 TB is free)

If we run queries that download a total of 10 TB of data, our cost per month will be:

  • $58.50 per month (we effectively pay for 9 TB, because the first 1 TB is free)
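Those two calculations generalize to one line: subtract the free terabyte and multiply by the regional rate. A sketch, using the Frankfurt price:

```python
def on_demand_monthly_cost_usd(tb_scanned: float,
                               price_per_tb: float = 6.50,
                               free_tb: float = 1.0) -> float:
    """Monthly on-demand analysis cost: the first free_tb TB processed are free."""
    return max(tb_scanned - free_tb, 0.0) * price_per_tb

print(on_demand_monthly_cost_usd(2))    # 6.5
print(on_demand_monthly_cost_usd(10))   # 58.5
print(on_demand_monthly_cost_usd(0.5))  # 0.0 -- still inside the free tier
```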

Flat-rate pricing => Price for reservation of fixed processing capacity

With this pricing model, you purchase Slots.

What is a slot?

Slot is a term used in the context of the BigQuery service.

This is the unit of processing capacity that BigQuery uses to execute queries. Slots are allocated to queries based on their complexity and processing capacity requirements. The more slots are allocated to a query, the faster it is executed.

The minimum commitment is 100 slots or a multiple of that number (200, 300, …, 1000 etc.). There is no limit to the number of commitments you can have.

Slots can be purchased as the following plans (commitments):

A. Flat-rate pricing. Flex plan => this is a short-term commitment of as little as 60 seconds

Flex plan pricing example for Frankfurt (europe-west3):

=> $5.20 per hour of operation, with 100 slots available


B. Monthly flat-rate commitment => this is a monthly commitment

With a monthly commitment, you pay for a certain number of slots for one month and then per second until you remove the commitment or convert it to an annual commitment plan.

The price list for a monthly commitment, in the Frankfurt (europe-west3) zone, is:

The important thing is:

A monthly commitment plan cannot be deleted (cancelled) until 30 days after the plan (commitment) becomes active.

However, after 30 days, we get the option to delete it at any time, and we will only be charged for the seconds during which our commitment was active.

If we do not close (delete) the plan, it will continue in the per-second billing model (plan – commitment – Flex, described above).

C. Annual flat-rate commitment => this is an annual commitment (365 days)

With the annual commitment, we pay for a certain number of slots for one year. 

Monthly price list, based on the annual plan, for the Frankfurt (europe-west3) location:

After one year, the annual commitment is converted to a Monthly flat-rate commitment by default, although we can define a different type of conversion, e.g. continue with an annual plan or convert it to the Flex commitment.

Does it pay to use long-term commitments?

With monthly and annual plans, you get a lower price in exchange for a long-term performance commitment. This can be seen, for example, when comparing the cost per month in the monthly commitment and the annual commitment.
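Whether a commitment pays off depends on how much you query. A rough break-even sketch comparing on-demand billing with a flat monthly commitment (the $2,190 monthly price for 100 slots below is a hypothetical example, not a quote; check the current Google Cloud price list for real numbers):

```python
def cheaper_plan(tb_scanned_per_month: float,
                 on_demand_per_tb: float = 6.50,
                 flat_monthly_usd: float = 2190.0) -> str:
    """Pick the cheaper billing model for a given monthly query volume.

    flat_monthly_usd is a hypothetical example price for a 100-slot
    monthly commitment -- not taken from the official price list.
    """
    on_demand = max(tb_scanned_per_month - 1.0, 0.0) * on_demand_per_tb  # first TB free
    return "flat-rate" if flat_monthly_usd < on_demand else "on-demand"

print(cheaper_plan(10))    # light usage -> on-demand
print(cheaper_plan(1000))  # heavy usage -> flat-rate
```

The design point is simply that on-demand cost scales linearly with terabytes scanned, while a commitment is a horizontal line; the crossover is where reservations start to pay.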

The post Integration of Google Analytics 4 and BigQuery. How much does it cost? Calculator appeared first on insightland.

]]>
https://insightland.org/blog/integration-of-google-analytics-4-and-bigquery-how-much-does-it-cost-calculator/feed/ 0
How Data Visualization Leads to Making Better Business Decisions https://insightland.org/blog/how-data-visualization-leads-to-making-better-business-decisions/ https://insightland.org/blog/how-data-visualization-leads-to-making-better-business-decisions/#respond Thu, 14 Oct 2021 09:32:26 +0000 https://ins-new.stagenv.dev/how-data-visualization-leads-to-making-better-business-decisions/ Nowadays more companies see the pressing need to take a data-driven approach to their business. Big data usage and data-driven became the popular advantages organizations praise. But with more data, we are able to collect, the harder it is to read the important one measurements and to derive valuable insights. We still thriving to make […]

The post How Data Visualization Leads to Making Better Business Decisions appeared first on insightland.

]]>
Nowadays more companies see the pressing need to take a data-driven approach to their business. Big data usage and data-driven decision-making have become advantages organizations are proud of. But the more data we are able to collect, the harder it is to read the important measurements and to derive valuable insights. We are still striving to make actual use of the collected data. That is where visualization helps in the world of numbers: data visualization makes it easier to derive insights and, as a result, to make better business decisions.

In this article, I will share with you:

  • Why data-driven decisions are better for your business
  • How data visualization can help in decision-making
  • What are the benefits of data visualization
  • What tools you can use to visualize your data.

So, let’s get into data visualization and its benefits for your business! 

Data-Driven Decisions Make Your Business Better

Data-driven is a really trendy phrase nowadays. It stands for collecting data, measuring goals, analyzing patterns, and deriving insights into a particular process or project in the company. Moreover, it helps in making better business decisions, such as developing a new strategy for one of the services your business provides, or removing a few products from your offer because they do not drive interest and sales.

The essence of a data-driven approach lies in consistency and willingness to grow wisely. 

Thanks to it, companies can create new opportunities for themselves, optimize current actions, generate more revenue, and predict future user behaviors and trends. Moreover, it allows your business to be more flexible. Companies that incorporate a data-driven approach usually treat information as a valuable asset and outperform their competition.

Data Visualization in Making Decisions – How Does It Help?

All kinds of visualizations, like pictures and graphs, make everything easier to comprehend. That is the reason behind creating data visualizations. They allow analysts to derive more and better insights. A great variety of information forms can complement each other in graphic form. Thanks to them we can easily identify patterns, point out the areas with potential, indicate the weakest spots, and, most importantly, recognize the best formulas. Your business can benefit a lot from well-created data visualizations.

To make the most of data visualization, you need to take care of a few important aspects:

  • The data must be shown with the background information that puts the graphs into context – that is how everyone will be able to understand everything easily.
  • Data visualizations should present key data in a way that indicates a clear course of action.

When data visualization includes these aspects, analysts are able to put insights into action.

Extra Benefits of Data Visualization

Above all, data visualization, beyond making better decisions, can improve many aspects of work in your organization.

1. Fast Response Times

Data visualization puts the data in front of users, allowing them to quickly identify issues and improve response or reaction times. It is a major advantage for companies that want to save time and react faster while staying flexible.

2. Improved Work Simplicity 

Visualizations allow users to get the big picture and see the details at the same time. When users interact only with relevant data, their work is simplified. They are able to focus on what is important at the moment.

3. Easy Pattern Recognition

Try to find a pattern while checking hundreds of lines in an Excel spreadsheet or a few different data sets. It is almost impossible, and it takes a lot of time! When you have a visualization of the data, you can absorb it better and easily recognize new paths. As a result, you are able to identify patterns and new trends, and even make predictions.

4. Improved Collaboration Between Teams

When all of the teams at your company can see the same data visualizations that are easy to understand, they are more often on the same page regarding the insights drawn from the data. As a result, collaboration between teams is smoother, since they start from the same understanding of the collected data. Therefore, they can decide on the next steps and discover solutions more quickly.

5. Combining Data from Various Sources

And last but definitely not least, data visualization allows you to see data from various sources. It is difficult to see if your Facebook and Instagram campaigns are successful by looking only at Facebook Business Manager analytics. So if you would like to compare them with data from your CRM, Google Analytics, or other sources, it is best to see them in a well-made, easy-to-comprehend data visualization.

Data Visualization Solutions

You already know that your business needs data visualization to make better decisions, but you are not sure how to incorporate it into your company. There is a great variety of different solutions and tools you can use. Most of them are easy to access, so you can start using them right away! Let's see what solutions your business can adopt.

Tableau Software

Tableau is an interactive data visualization software. It allows you to simplify raw data into an understandable format. You can create data visualization in the form of dashboards and worksheets. It is commonly used in the Business Intelligence Industry. Although the tool is not free of charge, it is worth the investment. Moreover, Tableau encourages you to join its community and learn even more about data visualization. You can share your visualizations online or on the server – you can choose if it is public or not. If you are not convinced, take a look at our data visualizations created with Tableau.

Microsoft Power BI

Power BI is a business analytics service provided by Microsoft. It allows you to create interactive visualizations. What is more, it provides business intelligence capabilities with an interface simple enough for end users to create their own reports and dashboards. Power BI is not a free tool, but it allows you to create a lot for a reasonable price.

Google Data Studio

This is a free tool provided by Google. Data Studio gives you everything you need to turn your analytics data into informative, easy-to-comprehend reports through data visualization. The reports are easy to read and share, and customizable for each of your clients if you work with more than one company. Moreover, you can choose how you want to showcase the data – bar graphs, charts, line graphs, etc. You are able to change fonts and colors and brand the reports with your logo. It is one of the most common tools analysts work with nowadays. In addition, if you want to know more about Data Studio, check out my blog posts on this tool.

Qlik

Qlik is a business intelligence and visual analytics platform you can try for free. It is an interactive data visualization tool that enables users to import and aggregate data from different data sources. They can then use the software's visualization tools to shape raw data into meaningful information. The two main products, QlikView and Qlik Sense, serve different purposes while running on the same engine. In QlikView, users pursue their day-to-day tasks, analyzing data with a slightly configurable dashboard; most of the data is somewhat “pre-canned”. On the other hand, Qlik Sense allows associating different data sources and fully configuring the visualizations, letting users follow an individual discovery path through the data. The company's official definition is: “QlikView is for guided analytics; Qlik Sense is for self-service visualizations.”

D3.js

If you want to have full control of your data visualization or your application, D3.js is a perfect choice. It is one of the best-known data visualization libraries. D3.js runs on JavaScript and uses HTML, CSS, and SVG. It is open-source, applies data-driven transformations to a webpage, and allows you to quickly create beautiful visualizations. D3.js combines powerful visualization and interaction techniques with a data-driven approach to DOM manipulation. Therefore, it gives you the full capabilities of modern browsers and the freedom to design the right visual interface for your data. It also provides great features for interactions and animations.

Python as a Data Visualization Tool

Python is a programming language commonly used by data analysts. You can create amazing data visualizations with it. What is more, it is designed with features that facilitate data analysis and visualization. This programming language offers multiple great graphing libraries that come packed with lots of different features. Whether you want to create interactive, live, or highly customized plots, Python has an excellent solution for you and your business.

To get a little overview, take a look at a few popular plotting libraries:
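For example, matplotlib, seaborn, Plotly, and Bokeh are all popular choices. A minimal bar-chart sketch with matplotlib, probably the most widely used of them (the numbers below are made up for illustration):

```python
import matplotlib
matplotlib.use("Agg")            # render off-screen; no display required
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
sessions = [1200, 1350, 1100, 1600, 1750, 1900]  # illustrative numbers

fig, ax = plt.subplots()
ax.bar(months, sessions)
ax.set_title("Monthly sessions")
ax.set_ylabel("Sessions")
fig.savefig("monthly_sessions.png")
```

A few lines like these already produce a shareable chart; the other libraries trade this simplicity for interactivity (Plotly, Bokeh) or nicer statistical defaults (seaborn).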

Incorporate Smart Data-Driven Approach to Your Company

Data visualization enhances analytical thinking, deriving insights, and making quick data-driven decisions. It is simply a wise way to improve many aspects of your business so it can thrive. Data visualization is essential nowadays, especially if you operate in a highly competitive market or industry. If you are not sure how to approach it yourself, seek an analytical partner that can introduce you to data visualization. Just do not neglect the power of data visualization and its benefits!

The post How Data Visualization Leads to Making Better Business Decisions appeared first on insightland.

]]>
https://insightland.org/blog/how-data-visualization-leads-to-making-better-business-decisions/feed/ 0
Can Data Studio beat Tableau or Power BI? https://insightland.org/blog/can-data-studio-beat-tableau-or-power-bi/ https://insightland.org/blog/can-data-studio-beat-tableau-or-power-bi/#respond Wed, 13 Oct 2021 09:27:00 +0000 https://ins-new.stagenv.dev/can-data-studio-beat-tableau-or-power-bi/ The number of data visualization tools is huge and is continuously growing. The most significant players who have been dominating the Gartner’s rankings for many years are Tableau and Microsoft Power BI. These platforms are undisputed leaders that give the user excellent data processing capabilities and create unique dashboards. However, can their position be threatened […]

The post Can Data Studio beat Tableau or Power BI? appeared first on insightland.

]]>
The number of data visualization tools is huge and continuously growing. The most significant players, who have dominated Gartner's rankings for many years, are Tableau and Microsoft Power BI.

These platforms are undisputed leaders that give the user excellent data processing capabilities and create unique dashboards.

However, can their position be threatened by Data Studio created by Google? We have three arguments supporting this thesis that can convince you.

Started at the bottom, now we’re here.

Data Studio has been on the market for several years, and we have been observing its development from the very beginning.

It started with the most straightforward charts and a small number of connectors, then added the ability to blend data from many sources, visualizations and connectors created by external suppliers, calculated fields allowing for more and more advanced calculations, and, most recently, parameters.

Data Studio is becoming an increasingly powerful tool, which, in several cases, has a significant advantage over other data visualization solutions.

#1 – It’s free

The simplest advantage is the price, or rather the lack of it. Although Tableau has a free public version, and Power BI can be used without paying for the desktop version, Data Studio stays free with almost no restrictions.

You only need to pay when you use connectors provided by third-party sites. In any other scenario, you don't need to take a credit card out of your wallet.

This is a massive advantage of the tool created by Google – other platforms can cost a lot.

Here at Hexe Data, we believe that a paid version of Data Studio will appear in the future. Of course, no one is talking about it yet, but such a development seems logical.

When we look at what happened with Google Analytics – a paid version was created for users with more significant needs and more data – a similar expansion of Data Studio could occur.

Additional functions and faster data processing could appear for an additional fee. It would be a natural step for Google to become more competitive with other data visualization tools.

#2 – It’s Google

The second reasonably obvious argument that makes Data Studio beat the rest of the visualization tools is its mother brand – Google.

If your company mainly uses the tools of this provider, you have nothing to think about. The choice is simple. Google Analytics, Google Sheets, Google Surveys, Google Ad Manager, Google Ads, Google Cloud Storage, Google Search Console…

It’s easy to find their common factor, right? Therefore, no other program will provide the same data connectivity from these sources as Google Data Studio.

This is especially true when you use data from Google Analytics. Other platforms extract data using the API and are therefore limited in the number of dimensions and metrics you can use at once.

Data Studio provides unlimited integration with Google Analytics, including the ability to sync the segments you have created in your web analytics tool panel view.

#3 – It’s simple

Using Tableau or Power BI, we can create mind-blowing reports. Stunning charts, the ability to add custom elements, perform complex calculations, animations, and give the recipient of our report the ability to explore data by clicking on various places and filters in the dashboard.

But all these fantastic features of advanced tools often remain unused. Even when creating the most complex dashboards, we most often use standard bar charts, line charts, or scorecards.

Data Studio is intuitive. It does not require any complicated implementation or involvement of the IT department.

Leonardo da Vinci was probably right when he said that simplicity is the ultimate sophistication. After all, data visualization aims to show information and numbers in a way that is easier to understand.

Sometimes the most straightforward solutions are simply the best.

David vs. Goliath

So is it possible for Data Studio to beat tools like Tableau or Power BI? Does simplicity have a chance to win over complexity?

At first glance, it might seem that the Google tool will be crushed like a little ant. This tool is currently not adapted to work with large data sets.

There is no desktop version, and it sometimes processes data for a long time; after applying a filter at the view level, we sometimes have to wait even a few minutes to see the updated statistics.

But if we look at the three incredibly important arguments we wrote about earlier, we can imagine a situation where Data Studio is announced as the winner. We will follow this fight in the future.

Photo by Markus Winkler on Unsplash

The post Can Data Studio beat Tableau or Power BI? appeared first on insightland.

]]>
https://insightland.org/blog/can-data-studio-beat-tableau-or-power-bi/feed/ 0