MARKEDMONDSON.ME
AUTHENTICATE AND CREATE GOOGLE APIS • GOOGLEAUTHR
R Google API libraries built using googleAuthR. Here is a list of available Google APIs made with this library. The libraries below are all cross-compatible because they use googleAuthR as their authentication backend: for example, they can share one OAuth2 login flow and can be used in multi-user Shiny apps. googleComputeEngineR - Google Compute Engine VMs API.

GOOGLE ANALYTICS REPORTING API V4 IN R EXAMPLES
The v4 API supports Universal Analytics. For working with Google Analytics 4 (App+Web), use the new Data API. The v4 API currently has these extras implemented over the v3 API. Check out more examples of using the API in real use cases on the www.dartistics.com website.

R AT SCALE ON THE GOOGLE CLOUD PLATFORM · MARK EDMONDSON
This post covers my current thinking on what I consider the optimal way to work with R on the Google Cloud Platform (GCP). It seems this has developed into my niche, and I get questions about it, so I would like to be able to point to a URL. Both R and GCP evolve rapidly, so this will have to be updated.
OSX BLACK SCREEN, NO LOGIN SCREEN, BUT WITH WORKING CURSOR
The fix below will let you log in again. It will only work in the above scenario; if your backlight is broken or it is something else, keep searching :) Before the fix below I tried: pressing the increase-brightness buttons (duh); restarting in safe mode (doesn't complete login); resetting SMC and PRAM (pushing CTRL+OPTION+POWER+other buttons on powerup).
GENTELELLA SHINY BY MARKEDMONDSON1234
There is no support for layouts other than the defaults in gentelellaPage(); in that case, use shiny::htmlTemplate directly and edit index.html to include your R code blocks in {{ brackets }}. Example files: a demo app for viewing Google Analytics data is available by running runExample(). It will start up a login page and a (logged-out) Shiny dashboard.

GOOGLE AUTHENTICATION TYPES FOR R • GOOGLEAUTHR
Quick user-based authentication. Once set up, you should go through the Google login flow in your browser when you run this command:

library(googleAuthR)
# starts auth process with defaults
gar_auth()
#> The googleAuthR package is requesting access to your Google account. Select a
#> pre-authorised account or enter '0' to obtain a new token.
SETTING UP GOOGLE ANALYTICS API DOWNLOADS TO R
Once you are logged in, issue the following two R commands in the R console (bottom-left):

library(googleAnalyticsR)
ga_auth()

Select "1: Yes" to say you wish to keep your OAuth access credentials. The library should then launch a browser window and ask you to log in to Google - log in with an email that has access to your Google Analytics.

DATA PRIVACY ENGINEERING WITH GOOGLE TAG MANAGER SERVER SIDE
A new interest at the moment is engineering through various data privacy requirements with some of the new tools Google has available. In keeping with my guiding principle of blogging about what I wish I could have read 6 months ago, I thought it worth writing about what is now possible. Data privacy engineering also requires clear thinking about the legal and technical details.

GOOGLE ANALYTICS API INTO R • GOOGLEANALYTICSR
Welcome to the website for googleAnalyticsR, an R library for working with Google Analytics data. Follow development on the project's GitHub development site. The Slack group googleAuthRverse includes a #googleAnalyticsR channel; for news, chat and support, join via this request form. Collaboration is welcomed and encouraged - if you are interested, get in touch.
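A minimal sketch of the download flow described in the setup snippet above, assuming the googleAnalyticsR package is installed and you have a Google Analytics login. The viewId below is a hypothetical placeholder - look yours up with ga_account_list():

```r
library(googleAnalyticsR)

# launches the browser OAuth2 flow described above
ga_auth()

# list the accounts/views your login can see, to find a viewId
accounts <- ga_account_list()

# download sessions per day for the last 30 days
# (123456 is a placeholder - substitute a viewId from accounts)
gadata <- google_analytics(
  viewId     = 123456,
  date_range = c(Sys.Date() - 30, Sys.Date()),
  metrics    = "sessions",
  dimensions = "date"
)
```

This is a sketch, not a definitive recipe - it needs a live Google login and an Analytics view to run.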
ENHANCE YOUR GOOGLE ANALYTICS DATA WITH R AND SHINY (FREE)
Introduction. The aim of this post is to give you the tools to enhance your Google Analytics data with R and present it online using Shiny. By following the steps below, you should have your own online GA dashboard with these features:

MY GOOGLE ANALYTICS TIME SERIES SHINY APP (ALPHA)
It'll take you to the Google account screen, where you say it's OK to use the data (if it is), and copy-paste the token it then displays. This token allows the app (but not me) to process your data. Go back to the app and paste the token into the box. Wait about 10 seconds, depending on how many accounts you have in your Google Analytics.

GOOGLE TAG MANAGER SERVER SIDE ON CLOUD RUN
Google Tag Manager Server Side on Cloud Run - pros and cons. One of the most exciting developments in 2020 for me is the launch of Google Tag Manager Server Side, which lies at the intersection of cloud and digital analytics that I've gravitated towards in recent years.
There are many good resources out there on GTM Server Side.

CREATING YOUR OWN COOKIELESS ANALYTICS TOOL WITH GTM
This is an example of how GTM Server Side can be used to create your own digital analytics tool. It's a proof of concept of what you can do given the power of GTM Server Side and its BigQuery integration. I customise the stream by adding cookieless tracking, displaying the data in Shiny, and running it all on Cloud Run to keep costs down but performance good. I shall dub this tool Edmonlytica.

AUTHENTICATE WITH GOOGLE CLOUD STORAGE API
The best way to authenticate is to use an environment argument pointing at your authentication file, making this function unnecessary. Set the file location of your downloaded Google Project JSON file in a GCS_AUTH_FILE argument. Then, when you load the library, you should auto-authenticate. However, you can also authenticate directly using this function.
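The environment-file approach just described can be sketched as below, assuming the googleCloudStorageR package and a service-account JSON key downloaded from your Google Project. The file path and bucket name are placeholders:

```r
# set before the library loads - in practice this usually lives
# in ~/.Renviron rather than in a script
Sys.setenv(GCS_AUTH_FILE = "/path/to/my-project-auth.json")

# on load, the library should auto-authenticate via GCS_AUTH_FILE
library(googleCloudStorageR)

# check it worked by listing objects in a bucket you own
objects <- gcs_list_objects(bucket = "my-bucket")
```

A sketch under stated assumptions - running it requires a real key file and bucket.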
INTRODUCING GOOGLECLOUDRUNNER
Google Cloud Platform's managed Knative service is Cloud Run, which reached general availability recently and is now, via googleCloudRunner, my preferred service for R APIs. You can read more about this via the presentation on R at scale on the Google Cloud Platform that I gave in New York in September 2019.
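Deploying an R API to Cloud Run with googleCloudRunner can be sketched roughly as follows. The project ID, region and folder name are placeholder assumptions; in normal use the project and region would come from environment variables set during package setup:

```r
library(googleCloudRunner)

# one-off session setup (placeholders - normally set via .Renviron)
cr_project_set("my-project")
cr_region_set("europe-west1")

# "api/" is assumed to hold a plumber api.R; this builds the
# container via Cloud Build and deploys it to Cloud Run
cr_deploy_plumber("api/")
```

This is a sketch, not the package's only workflow - it needs billing-enabled GCP credentials to run.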
R ON KUBERNETES
Why run R on Kubernetes? Kubernetes is a free and open-source utility for running jobs within a computer cluster. It abstracts away the servers the jobs run on, so you need only worry about the code to run. It has features such as scheduling, auto-scaling, and auto-healing to replace nodes if they break down. If you only need to run R on a single machine, then it's probably a bit OTT to use Kubernetes.

FIVE WAYS TO SCHEDULE R SCRIPTS ON GOOGLE CLOUD PLATFORM
A common question I come across is how to automate the scheduling of R scripts that download data. This post goes through some options I have played around with, which I've mostly used for downloading API data such as Google Analytics on the Google Cloud Platform, but the same principles could apply for AWS or Azure.
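As one example of the scheduling options discussed, the googleCloudRunner package can wrap an R script in a Cloud Build job and attach a Cloud Scheduler cron trigger. A sketch, assuming the package is set up with GCP credentials and that schedule.R is a hypothetical download script of yours:

```r
library(googleCloudRunner)

# builds schedule.R into a Cloud Build job and schedules it
# to run at 04:30 every day via Cloud Scheduler
cr_deploy_r(
  "schedule.R",
  schedule = "30 4 * * *"
)
```

Hedged heavily: this shows the shape of one option, not the post's definitive recommendation, and it needs a configured GCP project to execute.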
SHINY ON GOOGLE CLOUD RUN
The above means that your Shiny app is limited: the number of concurrent requests one container can have in Cloud Run is 80 connections. This means you lose the "scale-to-a-billion" feature, as on concurrent request 81 no container will be available to serve it. It also means the app won't autoscale as normal Cloud Run setups do.
R ON KUBERNETES
Why run R on Kubernetes? Kubernetes is a free and open-source utility to run jobs within a computer cluster. It abstracts away the servers the jobs are running on so you need only worry about the code to run. It has features such as scheduling, auto-scaling, and auto-healing to replace nodes if they breakdown.. If you only need to run R on a single machine, then its probably a bit OTT to use FIVE WAYS TO SCHEDULE R SCRIPTS ON GOOGLE CLOUD PLATFORM A common question I come across is how to automate scheduling of R scripts downloading data. This post goes through some options that I have played around with, which I’ve mostly used for downloading API data such as Google Analytics using the Google Cloud platform, but the same principles could apply for AWS or Azure. LAUNCH RSTUDIO SERVER IN THE GOOGLE CLOUD WITH TWO LINES I’ve written previously about how to get RStudio Server running on Google Compute Engine: the first in July 2014 gave you a snapshot to download then customise, the second in April 2016 launched via a Docker container. Things move on, and I now recommend using the process below that uses the RStudio template in the new on CRAN googleComputeEngineR package. Not only does it abstract RUN RSTUDIO SERVER ON A CHROMEBOOK AS A CLOUD NATIVE I recently got an Asus Chromebook Flip with which I’m very happy, but it did make me realise that if a Chromebook was to replace my normal desktop as my primary workstation, my RStudio Server setup would need to be more cloud native than was available up until now. TL;DR - A how-to on making RStudio Server run on a Chromebook that automatically backs up data and configuration LOGIN/LOGOUT SHINY OUTPUT Login/logout Shiny output. loginOutput.Rd. USe within a ui.R to render the login button generated by renderLogin. loginOutput ( output_name) ENHANCE YOUR GOOGLE ANALYTICS DATA WITH R AND SHINY (FREESEE MORE ONMARKEDMONDSON.ME
GOOGLE ANALYTICS API INTO R • GOOGLEANALYTICSR googleAnalyticsR. Welcome to the website for googleAnalyticsR, an R library for working with Google Analytics data.. Follow development on the project’s Github development site.. The Slack group googleAuthRverse includes a #googleAnalyticsR channel. For news, chat and support join via this request form.. Collaboration is welcomed and encouraged, if you are interested get in touch. AUTHENTICATE AND CREATE GOOGLE APIS • GOOGLEAUTHR R Google API libraries using googleAuthR. Here is a list of available Google APIs to make with this library. The below libraries are all cross-compatible as they use googleAuthR for authentication backend e.g. can use just one OAuth2 login flow and can be used in multi-user Shiny apps. googleComputeEngineR - Google Compute Engine VMs API. R AT SCALE ON THE GOOGLE CLOUD PLATFORM · MARK EDMONDSON R at scale on the Google Cloud Platform. This post covers my current thinking on what I consider the optimal way to work with R on the Google Cloud Platform (GCP). It seems this has developed into my niche, and I get questions about it so would like to be able to point to a URL. Both R and the GCP rapidly evolve, so this will have to beupdated
GOOGLE ANALYTICS REPORTING API V4 IN R EXAMPLES Google Analytics Reporting API v4 in R Examples. The v4 API supports Universal Analytics. For working with Google Analytics 4 (App+Web) use the new Data API. The v4 API currently has these extras implemented over the v3 API: Check out more examples on using the API in actual use cases on the www.dartistics.com website. OSX BLACK SCREEN NO LOGIN SCREEN BUT WITH WORKING CURSOR The fix below will let you login again. It will only work in the above scenario, if its your backlight broken or something else keep searching :) Before the below fix I tried: Pressing the increase brightness buttons (duh) Restarting in safe mode (doesn't complete login) Resetting SMC and PRAM (pusing CTRL+OPTION+POWER+other buttonson powerup
FIVE WAYS TO SCHEDULE R SCRIPTS ON GOOGLE CLOUD PLATFORM A common question I come across is how to automate scheduling of R scripts downloading data. This post goes through some options that I have played around with, which I’ve mostly used for downloading API data such as Google Analytics using the Google Cloud platform, but the same principles could apply for AWS or Azure. GOOGLE AUTHENTICATION TYPES FOR R • GOOGLEAUTHR Quick user based authentication. Once setup, then you should go through the Google login flow in your browser when you run this command: library ( googleAuthR ) # starts auth process with defaults gar_auth () #>The googleAuthR package is requesting access to your Google account. Select a #> pre-authorised account or enter '0' toobtain a new token.
ENHANCE YOUR GOOGLE ANALYTICS DATA WITH R AND SHINY (FREESEE MORE ONMARKEDMONDSON.ME
R SCRIPTS IN THE GOOGLE CLOUD VIA CLOUD RUN, CLOUD BUILD Ambition: select an R file, and have it scheduled in the cloud with a couple of clicks. Deploy your plumber API code automatically on Cloud Run to scale from 0 (no cost) to millions (auto-scaling). Integrate R inputs and outputs with other languages in a serverless cloud environment. Have R code react to events such as GitHub pushes and pub/sub messages.

EFFICIENT ANTI-SAMPLING WITH THE GOOGLE ANALYTICS REPORTING API Avoiding sampling is one of the most common reasons people start using the Google Analytics API. This blog lays out some pseudo-code to do so in an efficient manner, avoiding too many unnecessary API calls. The approach is used in the v4 calls for the R package googleAnalyticsR.

CALL GOOGLES NATURAL LANGUAGE API, CLOUD TRANSLATION API Language tools for R via Google Machine Learning APIs. Read the introduction blogpost on rOpenSci's blog. This package contains functions for analysing language through the Google Cloud Machine Learning APIs. Note all are paid services; you will need to provide your credit card details for your own Google Project to use them.

CREATE A GOOGLE LOGIN BEFORE YOUR SHINY UI LAUNCHES Source: R/shiny-modifyurl.R. gar_shiny_ui.Rd. A function that will turn your ui object into one that will look for Google authentication before loading the main app. Use together with gar_shiny_auth.

SERVERLESS BATCHED R SCRIPTS VIA CLOUD BUILD Cloud Build uses Docker containers to run everything. This means it can run almost any language/program or application, including R. Having an easy way to create and trigger these builds from R means R can serve as a UI or gateway to any other program, e.g. R can trigger a Cloud Build using gcloud to deploy Cloud Run applications. The first 120 minutes per day are free.
MARK EDMONDSON
AN ENGLISHMAN IN COPENHAGEN WRITING ABOUT DIGITAL, MUSIC AND ANYTHING ELSE.
2019 RESOLUTIONS
I intended this blog to cover everything that was not code, which is covered on https://code.markedmondson.me, but as you can see that never happened. This probably points to some imbalance in my life, and I hope to address that in 2019.

There is a lot else going on. In 2018 we moved to a dream house with a garden that gets me out a bit, and that has also helped give space for some music to come back. The new house is hopefully a move that benefits all the family; even though we got it as Sanne's and my careers waxed and waned, it should be a net positive in lots of ways.

A 2018 resolution was to travel as much with the family as I do with work, and that largely panned out and was positive, so this year I'll try the same with blogging - one post here about non-code for every post about code at the other place. The intended audience for this blog is anyone who cares, whilst the code blog aims at a more professional output.

Another habit I got back into in 2018 was reading. From the age of around 4 to 30-ish I hardly did anything else aside from read, but I dropped out of the habit for some reason, perhaps just the amount of reading I did at work. After setting a target of one book a month and 20 minutes a day (a goal that would have been child's play at age 16) I found my way out of the reading funk. It's a guaranteed way to relax, if you choose a book not too much like work. I'm on Goodreads if you would like to join me, and I hope to write about some of the new thoughts that have arisen from those books.

Music: I get a lot more practice mooching about in my new study, but lose access to the music bunker, so I'm not sure how that will go. But I'm a better player now at least. I'll put anything that is half-finished on Soundcloud here.

Gardening posts?! Stranger things have happened.

Religion/politics etc.? Yes, I should do this, as I've sworn off touching it on Twitter, decrying it as a terrible medium for politics in particular. This place should be better for long-form considerations.
Posted 2 years ago
BIGQUERY VISUALISER SHINY APP NOW FREE AND OPEN SOURCED

A few weeks ago I tweeted a beta version of a BigQuery Visualiser Shiny app that was well received, and got some valuable feedback on how it could be improved, in particular from @felipehoffa - thanks Felipe!

Here is a screenshot of the app:

MOTIVATION
The idea of the app is to enhance the standard BigQuery interface to include plots of the data you query. It uses ggplot2, a popular R plotting library; d3heatmaps, a d3.js JavaScript library to display heatmaps; and timelyportfolio's listviewer, a nice library for viewing all the BigQuery metadata in a collapsible tree. Other visualisations can be added fairly easily and will be over time, but if you have a request for something in particular you can raise an issue on the project's Github page.
I got into BigQuery once it started to receive exports from Google Analytics Premium. Since these exports carry unsampled raw data and include unique userIds, it's a richer data source for analysis than the Google Analytics reporting API. It was also a chance to create another Google API library called bigQueryR, the newest member of the googleAuthR family. Using googleAuthR meant Shiny support, and also meant bigQueryR can be used alongside googleAnalyticsR and searchConsoleR under one shared login flow. This is something exploited in this demo of RMarkdown, which pulls data from all three sources into a scheduled report.

RUNNING YOUR OWN BIGQUERY VISUALISER

All set-up instructions are listed on the BigQuery Visualiser's Github project page.
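As a minimal sketch of the shared googleAuthR login flow mentioned above (assuming current package APIs - the project, dataset and query names are placeholders to replace with your own):

```r
library(googleAuthR)
library(bigQueryR)
library(googleAnalyticsR)

# Request scopes for both BigQuery and Google Analytics
# so that one OAuth2 login covers both packages
options(googleAuthR.scopes.selected = c(
  "https://www.googleapis.com/auth/bigquery",
  "https://www.googleapis.com/auth/analytics.readonly"
))

gar_auth()  # one shared login flow for all googleAuthR packages

# Placeholder project/dataset/query - replace with your own
result <- bqr_query(
  projectId = "my-project",
  datasetId = "my_ga_export",
  query = "SELECT date, SUM(totals.visits) AS visits
           FROM [my_ga_export.ga_sessions_20160101]
           GROUP BY date"
)
```

The same authenticated session can then be reused by googleAnalyticsR or searchConsoleR calls without logging in again.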
You can run the Shiny app locally on your computer within RStudio; within your own company intranet if it's running Shiny Server; or publicly like the original app on shinyapps.io.

FEEDBACK

Please let me know what else could improve. I have a current pending issue on using JSON uploads for authentication that is waiting on a bug update in httr, the underlying library.
In particular, all the htmlwidgets packages could be added - this wonderful R library creates an R to d3.js interface, which holds some of the nicest visualisations on the web.

In this first release, I favoured plots that could apply to as many different data sets as possible. For your own use cases you can be more restrictive on what data is requested, and so be more ambitious in the plots. If you want inspiration, timelyportfolio (he who wrote the listviewer library) has a blog where he makes lots of htmlwidgets libraries. Enjoy! Hope it's of use - let me know if you build something cool with it.
TAGS
* analytics
* bigQueryR
* r
Posted 6 years ago
INTRODUCTION TO MACHINE LEARNING WITH WEB ANALYTICS: RANDOM FORESTS AND K-MEANS
MEASURECAMP #7
I've just come back from #MeasureCamp, where I attended some great talks: on hierarchical models; the process of analysis; a demo of Hadoop processing Adobe Analytics hits; web scraping with Python; and how machine learning will affect marketing in the future. Unfortunately the sad part of MeasureCamp is that you also miss some excellent content when sessions clash, but that's the nature of an ad-hoc schedule. I also got to meet some excellent analytics bods, friends old and new. Many thanks to all the organisers!

MY SESSIONS ON MACHINE LEARNING

After finishing my presentation I discovered I would need to talk waaay too quickly to fit it all in, so I decided to do a session on each example I had. The presentation is now available online here, so you can see what was intended. I got some great feedback, as well as requests for details from people who had missed the session, so this blog post will try to fill in some detail around the presentation we spoke about in the sessions.
SESSION 1: INTRODUCTION, GOOGLE ANALYTICS DATA AND RANDOM FOREST EXAMPLE
INTRODUCTION
_Machine learning gives programs the ability to learn without being explicitly programmed for a particular dataset. They make models from input data to create useful output, commonly predictive analytics._ (Arthur Samuel via Wikipedia)

There are plenty of machine learning resources, but not many that deal with web analytics in particular. The sessions are aimed at inspiring web analysts to use or add machine learning to their toolbox, showing two machine learning examples that detail:

* What data to extract
* How to process the data ready for the models
* Running the model
* Viewing and assessing the results
* Tips on how to put into production

Machine learning isn't magic. You may be able to make a model that uses obscure features, but a lot of intuition will be lost as a result. It's much better to have a model that uses features you can understand, one that scales up what a domain expert (e.g. you) could do if you had the time to go through all the data.

TYPES OF MACHINE LEARNING

Machine learning models are commonly split between supervised and unsupervised learning. We deal with an example of each:

* SUPERVISED: Train the model against a test set with known outcomes. Examples include spam detection and our example today, classifying users based on what they eventually buy. The model we use is known as Random Forests.
* UNSUPERVISED: Let the model find its own results. Examples include the clustering of users that we do in the second example using the k-means model.

Every machine learning project needs the elements below. They are not necessarily done in order, but a successful project will need to incorporate them all:

* POSE THE QUESTION - The most important step. We pose a question that our model needs to answer. We also review this question, and may modify it to better fit what the data can do as we work on the project.
* DATA PREPARATION - The majority of the work. It covers getting hold of the data, munging it so it fits the model and parsing the results. I've tried to include some R functions below that help with this, including getting the data from Google Analytics into R.
* RUNNING THE MODEL - The sexy statistics part. Whilst superstar statistics skills help to get the best results, you can still get useful output when applying model defaults, which we use today. The important thing is to understand the methods.
* ASSESSING THE RESULTS - What you'll be judged on. You will of course have a measure of how accurate the model is, but an important step is visualising this and being able to explain the model to non-technical people.
* HOW TO PUT IT INTO PRODUCTION - The ROI and business impact. A model that just runs in your R code on your laptop may be of interest, but it is ultimately not as useful for the business as a whole if you cannot recommend how to implement the model and its results in production. Here you will probably need to talk to IT about how to call your model, or even rewrite your prototype in a more production-level language.
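The supervised steps above can be sketched in a few lines of R. This is a hedged illustration using the `randomForest` package with its defaults and entirely hypothetical session data - the column names and outcome rule are invented for the example:

```r
library(randomForest)

# Hypothetical session-level data: behaviour metrics plus a
# known outcome (did the user eventually claim a prize?)
set.seed(42)
sessions <- data.frame(
  pageviews   = rpois(500, 5),
  session_sec = rexp(500, 1/120),
  prize_views = rpois(500, 2)
)
sessions$claimed <- factor(ifelse(
  sessions$prize_views + rnorm(500) > 2, "yes", "no"
))

# Train on a random 80%, hold 20% back to assess the results
train_idx <- sample(nrow(sessions), 0.8 * nrow(sessions))
rf <- randomForest(claimed ~ ., data = sessions[train_idx, ])

# Predict on the held-out set and tabulate the confusion matrix
preds <- predict(rf, sessions[-train_idx, ])
table(preds, sessions[-train_idx, "claimed"])
```

The confusion matrix from the held-out set is the simple "assessing the results" step: it shows where the model's classifications agree with what users actually did.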
PITFALLS USING MACHINE LEARNING IN WEB ANALYTICS

There are some considerations when dealing with web analytics data in particular:

* WEB ANALYTICS IS MESSY DATA - Definitions of various metrics, such as unique users, sessions or pageviews, can vary from website to website, so a thorough understanding of what you are working with is essential.
* MOST PRACTICAL ANALYSIS NEEDS ROBUST UNIQUE USER IDS - For useful, actionable output, machine learning models need to work on data that records useful dimensions, and for most websites that is your users. Unfortunately that is also the definition that is the most woolly in web analytics, given the nature of different access points. Having a robust unique user ID is very useful, and is what made the examples in this blog post possible.
* TIME-SERIES TECHNIQUES ARE THE QUICKEST WAY IN - If you don't have unique users, then you may want to look at time-series models instead, since web analytics is also a lot of count data over time. This is the reason I did GA Effect as one of my first data apps, since it could apply to most web analytics situations.
* CORRELATING CONFOUNDERS - It is common for web analytics to record highly correlated metrics, e.g. PPC clicks and cost. Watch out for these in your models, as they can overweight results.
* SELF-REINFORCING RESULTS - Also be wary of applying models that will favour their own results. For example, a personalisation algorithm that places products at the top of the page will naturally get more clicks. To get around this, consider using weighted metrics, such as a click curve for page links. Always test.
* NORMALISE YOUR METRICS - Make sure all metrics are on the same scale, otherwise some will dominate, e.g. pageviews and bounce rate in the same model.
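As a minimal sketch of the point about putting metrics on the same scale (the metric columns below are invented for illustration), base R's `scale()` centres each column and divides by its standard deviation, so a metric in the thousands no longer dominates one that lives between 0 and 1:

```r
## hypothetical web metrics on very different scales:
## pageviews in the thousands, bounce rate between 0 and 1
web_metrics <- data.frame(
  pageviews  = c(1200, 3400, 560, 8900, 2300),
  bounceRate = c(0.45, 0.32, 0.61, 0.28, 0.50)
)

## scale() centres each column and divides by its standard deviation,
## so neither metric dominates a distance-based model
scaled_metrics <- scale(web_metrics)

## both columns now have mean 0 and sd 1
round(apply(scaled_metrics, 2, sd), 2)
```

The same idea is what matters before the k-means example later: distance-based models treat a difference of 1 the same in every column.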
THE SCENARIO
Here is the situation the following examples are based upon. Hopefully it will be something familiar to your own case:

_You are in charge of a reward scheme website, where existing customers log in to spend their points. You want users to spend as many points as they can, so that the points have a high perceived value. You capture a unique user ID on login in custom dimension 1, and use Google Analytics enhanced e-commerce to track which prizes users view and claim._

Notice this scenario involves a reliable user ID, since every user logs in to use the website. This may be tricky to achieve on your own website, so you may need to work with only a subset of your users. In my view, the data gains you can make from reliable user identification mean I try to encourage website designs that involve logged-in content as much as possible.

RANDOM FORESTS
Now we get into the first example. Random Forests are a popular machine learning tool as they typically give good results - in Kaggle competitions they are often the benchmark to beat.

Random Forests are based on decision trees, and decision trees are the topic of a recent interactive visualisation on machine learning that has been doing the rounds. It's really great, so check it out first, then come back here.

Back? OK great, so now you know about decision trees. RANDOM FORESTS are a simple extension: a collection of decision trees is a Random Forest. A problem with a single decision tree is that it will overfit your data - when you throw new data at it you will get misclassifications. It turns out, though, that if you aggregate decision trees fit on subsets of your original data, all those slightly worse models added up make one robust model, meaning that when you throw new data at a Random Forest it is more likely to be a close fit.
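The aggregation idea can be sketched in a few lines of base R on synthetic data (not the author's dataset): here a deliberately overfitting "mini tree" - just a binned-mean predictor standing in for a real decision tree - is fit once, and then fit on 100 bootstrap resamples and averaged, the way a Random Forest averages its trees:

```r
set.seed(42)

## synthetic data: a noisy sine curve (invented for illustration)
n <- 200
x <- runif(n)
y <- sin(2 * pi * x) + rnorm(n, sd = 0.4)

## a deliberately high-variance "mini tree": predict the mean of y within
## 20 bins of x, analogous to a deep decision tree that overfits
fit_binned <- function(x, y, newx){
  breaks <- seq(0, 1, length.out = 21)
  means  <- tapply(y, cut(x, breaks, include.lowest = TRUE), mean)
  unname(means[cut(newx, breaks, include.lowest = TRUE)])
}

newx  <- seq(0.05, 0.95, by = 0.1)
truth <- sin(2 * pi * newx)

## one model fit on the data once
single_pred <- fit_binned(x, y, newx)

## "forest": the same model fit on 100 bootstrap resamples, then averaged
boot_preds <- replicate(100, {
  i <- sample(n, replace = TRUE)
  fit_binned(x[i], y[i], newx)
})
bagged_pred <- rowMeans(boot_preds, na.rm = TRUE)

## compare mean squared error against the true curve -
## the aggregated prediction is typically the closer fit
mse_single <- mean((single_pred - truth)^2, na.rm = TRUE)
mse_bagged <- mean((bagged_pred - truth)^2, na.rm = TRUE)
```

The `randomForest` package does the same thing properly, with real trees and a random subset of features at each split.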
If you want more detail, check out the very readable original paper by Breiman and Cutler, and a tutorial on using it with R is here.
EXAMPLE 1: CAN WE PREDICT WHAT PRIZES A USER WILL CLAIM FROM THEIR VIEW HISTORY?
Now we are back looking at our test scenario. We have noticed that a lot of users aren't claiming prizes despite browsing the website, and we want to see if we can encourage them to claim prizes, so they value the points more and spend more to get them. We want to look at users who do claim, and see what prizes they look at before they claim. Next we will see if we can build a model to predict what a user will claim based on their view history. In production, we will use this to e-mail prize suggestions to users who have viewed but not claimed, to see if it improves uptake.

FETCHING THE DATA
Use your favourite library for getting Google Analytics data into R - I'm using my experimental new library, googleAnalyticsR, but it doesn't matter which; the important thing is what is being fetched. In this example the user ID is being captured in custom dimension 1, and we're pulling out the product SKU code. This is transferable to other web analytics tools such as Adobe Analytics (perhaps via the RSiteCatalyst package).
library(googleAnalyticsR)

gar_auth(new_user = TRUE)

## your profile view ID
id <- "XXXXXX"

## 61607 results
## 30049 unique IDs
## 185 SKUs
product_views <- google_analytics(id, "2015-08-07", "2015-09-01",
                                  metrics = 'productDetailViews',
                                  dimensions = c('productSku', 'dimension1'),
                                  samplingLevel = "WALK")

## 8855 results
## 6336 unique IDs
## 169 SKUs
product_trans <- google_analytics(id, "2015-08-07", "2015-09-01",
                                  metrics = c('itemRevenue', 'uniquePurchases'),
                                  dimensions = c('transactionId', 'productSku', 'dimension1'))
Note we needed two API calls to get the views and the transactions, as these can't be queried in the same call. They will be merged later.

TRANSFORMING THE DATA

We now need to put the data into a format that will work with Random Forests. We need a matrix of predictors to feed into the model and one column of responses showing the desired output labels, shaped so there is one row per user. Here is some R code to "widen" the data to get this format. We then split the data set randomly: 75% for training, 25% for testing.

## want: 30049 x 187
## userId, product1_view, product2_view, ..., productN_view, productBought
pv <- reshape2::recast(product_views,
                       dimension1 ~ productSku + variable,
                       fun.aggregate = sum)

library(dplyr)

## if a user buys more than once, the row will be duplicated
pt <- product_trans %>% select(boughtSku = productSku, dimension1)

model_data <- left_join(pv, pt)

## NAs are no sale
model_data$boughtSku[is.na(model_data$boughtSku)] <- "NoSale"

## splitting into training and test:
## 75% of the sample size
smp_size <- floor(0.75 * nrow(model_data))

## set the seed to make your partition reproducible
set.seed(123)
train_ind <- sample(seq_len(nrow(model_data)), size = smp_size)

## split the data
train <- model_data[train_ind, ]
test  <- model_data[-train_ind, ]

## what to use in the model: drop the user ID and the response column
predictors <- train[, setdiff(names(train), c("dimension1", "boughtSku"))]
response   <- as.factor(train$boughtSku)
RUNNING RANDOM FOREST AND ASSESSING THE RESULTS

We now run the model - this can take a long time with lots of dimensions (it can be much improved using PCA for dimension reduction, see later). We then test the model on the test data, and get an accuracy figure:

library(randomForest)

## warning - can take a long time (30 mins)
rf <- randomForest(x = predictors, y = response)

## once the model is done, we run it on the test data and compare results to reality
predictor_test <- test[, setdiff(names(test), c("dimension1", "boughtSku"))]
response_test  <- as.factor(test$boughtSku)

## check the result on the test set
prediction <- predict(rf, predictor_test)

predictor_test$correct <- as.character(prediction) == as.character(response_test)

## how many were correct?
table(as.character(prediction) == as.character(response_test))

accuracy <- sum(predictor_test$correct) / nrow(predictor_test)

On my example test set I got ~70% accuracy on this initial run, which is not bad, but it is possible to get up to 90-95% with some tweaking. Anyhow, let's plot the test vs predicted product frequencies, to see how it looks:

## function to get the plot data format
getCompareTable <- function(test_data, prediction){

  require(dplyr)

  ## real vs model bought SKU
  actual_freq    <- table(test_data$boughtSku)
  predicted_freq <- table(prediction)

  actual_freq_s <- data.frame(sku = names(actual_freq),
                              actual = as.vector(actual_freq),
                              stringsAsFactors = FALSE)
  predicted_freq_s <- data.frame(sku = names(predicted_freq),
                                 predict = as.vector(predicted_freq),
                                 stringsAsFactors = FALSE)

  compare <- dplyr::left_join(actual_freq_s, predicted_freq_s, by = "sku")

  compare
}

## use the function to get the plot data
compare <- getCompareTable(test, prediction)

## plot the predicted vs actual in the test set
library(ggplot2)
library(reshape2)

compare_long <- melt(compare)

g <- ggplot(data = compare_long,
            aes(x = sku, y = value, colour = variable, group = variable)) + theme_bw()
g <- g + geom_bar(stat = "identity", position = "dodge", aes(fill = variable))
g

This outputted the below plot. It can be seen that in general the ~70%-accurate model predicted many products well, but with a lot of error for one large outlier. Examining the data, this product SKU was for a cash-only prize. A next step would be to look at how to deal with this product in particular, since eliminating it improves accuracy to ~85% in one swoop.
NEXT STEPS FOR THE RANDOM FOREST

There I stop, but there are lots of next steps that could be taken to make the model applicable to the business. A non-exhaustive list:

* Run the model on more test sets
* Train the model on more data
* Try reducing the number of parameters (see PCA later)
* Examine large error outliers
* Compare with simple models (last/first product viewed?) - complicated is not always best!
* Run the model against users who have viewed but not yet bought
* Run an email campaign with a control group and the model results for the final judgement
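On the "compare with simple models" point, even a baseline as crude as always predicting the most frequent class is worth computing, because it sets the accuracy floor any model must beat. A minimal sketch, with toy labels invented to stand in for the test set's boughtSku column:

```r
## toy response labels standing in for boughtSku in the test set
response_test <- factor(c("NoSale", "NoSale", "skuA", "NoSale", "skuB",
                          "NoSale", "skuA", "NoSale", "NoSale", "skuA"))

## baseline: always predict the most common class
majority_class <- names(which.max(table(response_test)))

## the accuracy any real model must beat
baseline_accuracy <- mean(response_test == majority_class)

majority_class     # "NoSale"
baseline_accuracy  # 0.6
```

If most users are "NoSale", a model reporting 70% accuracy may be doing little better than this baseline, which is why per-class results (as in the frequency plot above) matter as much as the headline number.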
It is hoped the above inspired you to try it yourself.

SESSION 2: K-MEANS, PRINCIPAL COMPONENT ANALYSIS AND SUMMARY

EXAMPLE 2: CAN WE CLUSTER USERS BASED ON THEIR PRODUCT VIEW BEHAVIOUR?

Now we look at k-means clustering. The questions we are trying to answer are something like this:

_Do we have suitable prize categories on the website? How do our website categories compare to user behaviour?_

The k-means clustering, we hope, will give us data to help with decisions on how the website is organised. For this we will use the same data as we used for Random Forests, with one minor change: as k-means is an unsupervised model, we take off our product labels. A lot of this example is inspired by this nice beginners' walk-through on k-means with R.

INTRODUCTION TO K-MEANS CLUSTERING

This video tutorial on k-means explains it well. The video's example has two dimensions, but k-means can apply to many more dimensions than that; we just can't visualise them easily. In our case we have 185 product views that will each serve as a dimension. However, problems with that many dimensions include long processing time alongside the danger of over-fitting the data, so we now look at PCA.
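Before the 185-dimensional case, a two-dimensional sketch with base R's `kmeans()` shows the mechanics; the two synthetic "user groups" below are invented for illustration:

```r
set.seed(123)

## two synthetic user groups: low-activity and high-activity
views_a <- cbind(pageviews = rnorm(50, mean = 5,  sd = 1),
                 sessions  = rnorm(50, mean = 3,  sd = 1))
views_b <- cbind(pageviews = rnorm(50, mean = 20, sd = 2),
                 sessions  = rnorm(50, mean = 12, sd = 2))
users <- rbind(views_a, views_b)

## ask k-means for 2 clusters; nstart = 25 tries 25 random starting
## points and keeps the best, making the result more stable
k <- kmeans(users, centers = 2, nstart = 25)

table(k$cluster)   # cluster sizes
k$centers          # cluster means, near (5, 3) and (20, 12)
```

Each user ends up assigned to the nearest cluster centre; with 185 product-view dimensions the arithmetic is identical, just impossible to plot directly.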
PRINCIPAL COMPONENT ANALYSIS (PCA)

We perform Principal Component Analysis (PCA) to see if there are important products that dominate the model - this could have been applied to the previous Random Forest example as well, and indeed a final production pipeline could feed the output of one model, such as k-means, into Random Forests. PCA rotates dimensions to try and minimise them as much as possible, then ranks them by the amount of variance they account for. There is a good visualisation of this here.
The clustering we do will actually be performed on the top rotated dimensions we find via PCA, and we will then map these back to the original pages for the final output. This also takes care of situations such as one product being viewed in every cluster: PCA will minimise this dimension. The code below looks for the principal components, then gives us some outputs to help decide how many dimensions to choose. A rule of thumb is to look for the components that give us roughly ~85% of the variance. For the below data this was actually 35 dimensions (reduced from the 185 before).

## k-means is unsupervised, so work on the data without the user ID and product labels
k_data <- model_data[, setdiff(names(model_data), c("dimension1", "boughtSku"))]

## finding the number of components
pc <- princomp(k_data)
plot(pc, type = "l")

## look for the dimensions that give ~85% of variance
summary(pc)
loadings(pc)

## run the more convenient PCA needed for k-means
pc <- prcomp(k_data)

## choose the top 35 dimensions
## limit data to the first 35 columns
comp <- data.frame(pc$x[, 1:35])
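To make the "~85% of variance" rule of thumb concrete, the proportion can be computed directly from the `prcomp` output; the matrix below is synthetic, standing in for the product view data:

```r
set.seed(1)

## synthetic stand-in for the product view matrix
k_data <- matrix(rnorm(300 * 10), ncol = 10)
k_data[, 1] <- k_data[, 1] * 5   # make one dimension dominate

pc <- prcomp(k_data)

## proportion of variance per component, and its cumulative sum
prop_var <- pc$sdev^2 / sum(pc$sdev^2)
cum_var  <- cumsum(prop_var)

## smallest number of components explaining at least 85% of the variance
n_comp <- which(cum_var >= 0.85)[1]
n_comp
```

The same `cumsum` line applied to the real data is what tells you that 35 of the 185 dimensions are enough here.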
The plot output from the above is below. We can see the first principal component accounts for 50% of the variance, but then the variation is flattish.

HOW MANY CLUSTERS?
How many clusters to pick for k-means can be a subjective choice. There are other clustering models that pick for you, but some kind of decision process will depend on what you need. There are, however, ways to help inform that decision. Running the k-means modelling for an increasing number of clusters, we can look at an error measure (the within-groups sum of squares) for each. When we plot these attempts for each cluster iteration, we can see how the graph changes or levels off at various cluster sizes, and use that to help with our decision:

# determine the number of clusters
## run kmeans for a varying number of clusters, 1 to 15
wss <- (nrow(comp) - 1) * sum(apply(comp, 2, var))
for (i in 2:15) wss[i] <- sum(kmeans(comp, centers = i)$withinss)

plot(1:15, wss, type = "b",
     xlab = "Number of Clusters",
     ylab = "Within groups sum of squares")

# from the scree plot, the elbow occurs at k = 4-6
# we'll choose 4 clusters:
k <- kmeans(comp, 4, nstart = 25, iter.max = 1000)

The plot for determining the clusters is here - see the fall between 2-4 clusters. We went with 4 for this example, although a case could be made for 6:
ASSESSING THE CLUSTERS AND VISUALISATION

I find heatmaps are a good way to assess clustering results, since they offer a good way to overview groupings. We are basically looking to see if the clusters found are different enough to make sense.

kResults <- data.frame(k_data, cluster = k$cluster)

## transform the data into columns of cluster, rows of SKU,
## with the value being the mean views per user in that cluster
rl <- as.data.frame(lapply(1:4, function(x){
  r3 <- kResults[kResults$cluster == x, names(kResults) != "cluster"]
  r4 <- colSums(r3) / nrow(r3)
  r4
}))

names(rl) <- paste("cluster", 1:4)

## plot using the d3heatmap library
library(d3heatmap)
d3heatmap(rl, theme = "dark", scale = 'row')

This gives the following visualisation. In an interactive RStudio or Shiny session this is zoomable for finer detail, but here we just exported the image:
From the heatmap we can see that each cluster does have distinctly different product views.

K-MEANS - NEXT STEPS

The next step is to take these clusters and examine the products within them, looking for patterns. This is where your domain knowledge is needed, as all we have done here is group products together based on statistics - the "why" is not in here. When I've performed this in the past, I try to give a named persona to each cluster type. Examples include "Big Spenders" for those who visit the payment page a lot, "Sport Freaks" for those who tend to only look at sports goods, etc. Again, this will largely depend on the number of clusters you have chosen, so you may want to vary this to tweak the results you are looking for. Recommendations then follow on how to group pages: A/B tests can be performed to test whether the clustering makes an impact.

SUMMARY
I hope the above example workflows have inspired you to try it with your own data. Both examples can be improved - for instance, we took no account of the order of product views, or of other metrics such as time on site - but the idea was to give you a way in to try these yourself.

I chose k-means and Random Forests as they are two of the most popular models, but there are lots to choose from. This diagram from a Python machine learning library, scikit-learn, offers an excellent overview of how to choose which other machine learning model you may want to use for your data.

All in all, I hope some of the mystery around machine learning has been taken out, and you can see how it can be applied to your work. If you are interested in really getting to grips with machine learning, the Coursera course was excellent and what set me on my way.
Do please let me know of any feedback, errors, or what you have done with the above - I'd love to hear from you. Good luck!
TAGS
* analytics
* r
* machine learning
Posted 6 years ago
GOOGLE API CLIENT LIBRARY FOR R: GOOGLEAUTHR V0.1.0 NOW AVAILABLE ON CRAN

One of the problems with working with Google APIs is that quite often the hardest bit, authentication, comes right at the start. This presents a big hurdle for those who want to work with them - it certainly delayed me. In particular, having Google authentication work with Shiny is problematic, as the token itself needs to be reactive and only applicable to the user who is authenticating. But no longer! googleAuthR provides helper functions to make it easy to work with Google APIs. And it's now available on CRAN (my first CRAN package!), so you can install it easily by typing:
> install.packages("googleAuthR")

It should then load, and you can get started by looking at the readme files on Github or typing:
> vignette("googleAuthR")

After my experiences making shinyga and searchConsoleR, I decided that reinventing the authentication wheel each time wasn't necessary, so I worked on this new R package that smooths out this pain point. googleAuthR provides easy authentication within R, or in a Shiny app, for Google APIs. It provides a function factory you can use to generate your own functions that make the calls or do the actions you need. At the last count there are 83 APIs, many of which have no R library, so hopefully this library can help with that. Examples include the Google Prediction API, the YouTube Analytics API, the Gmail API, etc.
EXAMPLE USING GOOGLEAUTHR

Here is an example of making a goo.gl R package using googleAuthR:

library(googleAuthR)

## change the native googleAuthR scopes to the one needed
options("googleAuthR.scopes.selected" =
          c("https://www.googleapis.com/auth/urlshortener"))

#' Shortens a url using goo.gl
#'
#' @param url URL to shorten with goo.gl
#'
#' @return a string of the short URL
shorten_url <- function(url){

  body = list(
    longUrl = url
  )

  f <- gar_api_generator("https://www.googleapis.com/urlshortener/v1/url",
                         "POST",
                         data_parse_function = function(x) x$id)

  f(the_body = body)
}

#' Expands a url that has used goo.gl
#'
#' @param shortUrl URL that was shortened with goo.gl
#'
#' @return a string of the expanded URL
expand_url <- function(shortUrl){

  f <- gar_api_generator("https://www.googleapis.com/urlshortener/v1/url",
                         "GET",
                         pars_args = list(shortUrl = "shortUrl"),
                         data_parse_function = function(x) x)

  f(pars_arguments = list(shortUrl = shortUrl))
}

#' Get analytics of a url that has used goo.gl
#'
#' @param shortUrl URL that was shortened with goo.gl
#' @param timespan The time period for the analytics data
#'
#' @return a dataframe of the goo.gl URL analytics
analytics_url <- function(shortUrl,
                          timespan = c("allTime", "month", "week", "day", "twoHours")){

  timespan <- match.arg(timespan)

  f <- gar_api_generator("https://www.googleapis.com/urlshortener/v1/url",
                         "GET",
                         pars_args = list(shortUrl = "shortUrl",
                                          projection = "FULL"),
                         data_parse_function = function(x) {
                           a <- x$analytics
                           return(a)
                         })

  f(pars_arguments = list(shortUrl = shortUrl))
}

#' Get the history of the authenticated user
#'
#' @return a dataframe of the goo.gl user's history
user_history <- function(){

  f <- gar_api_generator("https://www.googleapis.com/urlshortener/v1/url/history",
                         "GET",
                         data_parse_function = function(x) x$items)

  f()
}

## To use the above functions:
library(googleAuthR)

# go through the authentication flow
gar_auth()

s <- shorten_url("http://markedmondson.me")
s

expand_url(s)

analytics_url(s, timespan = "month")

user_history()
If you then want to make this multi-user in Shiny, you just need to use the helper functions provided:

## in global.R
library(googleAuthR)

options("googleAuthR.scopes.selected" =
          c("https://www.googleapis.com/auth/urlshortener"))

shorten_url <- function(url){

  body = list(
    longUrl = url
  )

  f <- gar_api_generator("https://www.googleapis.com/urlshortener/v1/url",
                         "POST",
                         data_parse_function = function(x) x$id)

  f(the_body = body)
}

## in server.R
library(shiny)
library(googleAuthR)
source('global.R')

shinyServer(function(input, output, session){

  ## Get auth code from return URL
  access_token <- reactiveAccessToken(session)

  ## Make a loginButton to display using loginOutput
  output$loginButton <- renderLogin(session, access_token())

  short_url_output <- eventReactive(input$submit, {
    ## wrap the existing function with_shiny
    ## pass the reactive token in shiny_access_token
    ## pass other named arguments
    short_url <- with_shiny(f = shorten_url,
                            shiny_access_token = access_token(),
                            url = input$url)
  })

  output$short_url <- renderText({
    short_url_output()
  })

})

## in ui.R
library(shiny)
library(googleAuthR)

shinyUI(
  fluidPage(
    loginOutput("loginButton"),
    textInput("url", "Enter URL"),
    actionButton("submit", "Shorten URL"),
    textOutput("short_url")
  )
)

TAGS
* programming
* r
Posted 6 years ago
AUTOMATING GOOGLE CONSOLE SEARCH ANALYTICS DATA DOWNLOADS WITH R AND SEARCHCONSOLER

Yesterday I published version 0.1 of searchConsoleR, a package that interacts with Google Search Console (formerly Google Webmaster Tools) and in particular its search analytics.
I'm excited about the possibilities with this package, as this new improved data is now available in a way that can interact with all the thousands of other R packages. If you'd like to see searchConsoleR's capabilities, I have the package running an interactive demo here (very bare bones, but it should demo the data well enough). The first application I'll talk about in this post is archiving data into a .csv file, but expect more guides to come, in particular on combining this data with Google Analytics.

AUTOMATIC SEARCH ANALYTICS DATA DOWNLOADS

The 90-day limit still applies to the search analytics data, so one of the first applications should be archiving that data, to enable year-on-year, month-on-month and general tracking of the development of your SEO rankings. The below R script:
* Downloads and installs the searchConsoleR package if it isn't installed already.
* Lets you set some parameters you want to download.
* Downloads the data via the search_analytics function.
* Writes it to a .csv in the same folder the script is run in.
* The .csv file can be opened in Excel or similar.

## A script to download and archive Google search analytics
##
## Demo of the searchConsoleR R package.
##
## Version 1 - 10th August 2015
##
## Mark Edmondson (http://markedmondson.me)

## load the required libraries
## (download them with install.packages("googleAuthR") and
##  install.packages("searchConsoleR") if necessary)
library(googleAuthR)
library(searchConsoleR)

## change this to the website you want to download data for. Include http
website <- "http://example.com"

## data is reliably in search console 3 days ago, so we download from then
## today - 3 days
start <- Sys.Date() - 3
## one day's data, but change it as needed
end <- start

## what to download; choose between date, query, page, device, country
download_dimensions <- c('date', 'query')

## what type of Google search; choose between 'web', 'video' or 'image'
type <- c('web')

## other options are available, check out ?search_analytics in the R console

## Authorise the script with Search Console.
## The first time you will need to login to Google,
## but it should auto-refresh after that, so it can be scheduled.
## Authorise with an account that has access to the website.
gar_auth()

## the first time, stop here and wait for authorisation

## get the search analytics data
data <- search_analytics(siteURL = website,
                         startDate = start,
                         endDate = end,
                         dimensions = download_dimensions,
                         searchType = type)

data

## do stuff to the data:
## combine with Google Analytics, filter, apply other stats etc.

## write a csv with a nice filename
filename <- paste("search_analytics",
                  Sys.Date(),
                  paste(download_dimensions, collapse = "", sep = ""),
                  type, ".csv", sep = "-")

write.csv(data, filename)
This should give you nice juicy data.

CONSIDERATIONS

The first time, you will need to run the scr_auth() step yourself so you can give the package access, but afterwards it will auto-refresh the authentication each time you run the script. If you ever need a new user to be authenticated, run scr_auth(new_user = TRUE).

You may want to modify the script so it appends to a file instead of making a daily dump, although I do this with a folder of daily .csv's and import them all into one R dataframe (which you could export again to one big .csv):

## assuming you have a folder full of .csv's to merge
## the csv's must all have identical column names
folder <- "./path/to/csv/files"
filenames <- list.files(folder, full.names = TRUE)

all_files <- Reduce(rbind, lapply(filenames, read.csv))
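If you prefer the appending approach, one way (the file name and columns below are illustrative, not from the package) is `write.table()` with `append = TRUE`, writing the header only when the file is first created:

```r
## hypothetical daily archive file
archive <- file.path(tempdir(), "search_analytics_archive.csv")

append_day <- function(data, file){
  first_write <- !file.exists(file)
  write.table(data, file,
              sep = ",", row.names = FALSE,
              col.names = first_write,   # header only on the first write
              append = !first_write)
}

## two days of dummy data, appended in turn
day1 <- data.frame(date = "2015-08-10", query = "r analytics", clicks = 12)
day2 <- data.frame(date = "2015-08-11", query = "r analytics", clicks = 15)

append_day(day1, archive)
append_day(day2, archive)

nrow(read.csv(archive))  # 2 rows, one per day
```

Run from a daily schedule, this grows one long archive file instead of a folder of dumps.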
AUTOMATION
You can now take the download script and use it in automated batch files, to run daily. In Windows, this can be done like this (from SO):

* Open the scheduler: START -> All Programs -> Accessories -> System Tools -> Scheduler
* Create a NEW TASK
* Under the Action tab, create a NEW ACTION
* Choose START PROGRAM
* Browse to RSCRIPT.EXE, which should be placed e.g. here: "C:\PROGRAM FILES\R\R-3.2.0\BIN\X64\RSCRIPT.EXE"
* Input the name of your file in the PARAMETERS field
* Input the path where the script is to be found in the START IN field
* Go to the TRIGGERS tab
* Create a NEW TRIGGER
* Choose that the task should be done EACH DAY, month, ... repeated several times, or whatever you like

In Linux, you can probably work it out yourself :)

CONCLUSION
Hopefully this shows how, with a few lines of R, you can get access to this data set. I'll be doing more posts in the future using this package, so if you have any feedback let me know and I may be able to post about it. If you find any bugs or have features you would like, please also report an issue on the searchConsoleR issues page on Github.
TAGS
* r
* analytics
* seo
Posted 6 years ago
ENHANCE YOUR GOOGLE ANALYTICS DATA WITH R AND SHINY (FREE ONLINE DASHBOARD TEMPLATE)

INTRODUCTION

The aim of this post is to give you the tools to enhance your Google Analytics data with R and present it on-line using Shiny. By following the steps below, you should have your own on-line GA dashboard, with these features:

* Interactive trend graphs.
* Auto-updating Google Analytics data.
* Zoomable day-of-week heatmaps.
* Top-level trends via Year on Year, Month on Month and Last Month vs Month Last Year data modules.
* A MySQL connection for blending your own data with GA data.
* An easy upload option to update a MySQL database.
* Analysis of the impact of marketing events via Google's CausalImpact.
* Detection of unusual time points using Twitter's AnomalyDetection.

A lot of these features are either unavailable in the normal GA reports, or only possible in Google Analytics Premium. Under the hood, the dashboard is exporting the data via the Google Analytics Reporting API, transforming it with various R statistical packages, and then publishing it on-line via Shiny.

A live demo of the dashboard template is available on my Shinyapps.io account with dummy GA data, and all the code used is on Github:

* Example app: https://mark.shinyapps.io/GA-dashboard-demo
* Code on Github: https://github.com/MarkEdmondson1234/ga-dashboard-demo

FEATURE DETAIL
Here are some details on the modules within the dashboard. A quick-start guide on how to get the dashboard running with your own data is at the bottom.

TREND GRAPH

Most dashboards feature a trend plot, so you can quickly see how you are doing over time. The dashboard uses the dygraphs javascript library, which allows you to interact with the plot to zoom, pan and shift your date window. Plot smoothing is provided at the day, week, month and annual level. Additionally, the events you upload via the MySQL upload also appear here, as do any unusual time points detected as anomalies. You can go into greater detail on these in the Analyse section.

HEATMAP
Heatmaps use colour intensity to show metrics between categories. The heatmap here is split into weeks and days per week, so you can quickly scan to see if a particular day of the week is popular - in the below plot, Monday/Tuesday look like the best days for traffic.
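The underlying reshape for such a heatmap can be sketched in base R: pivot a date/sessions frame into weeks as rows and weekdays as columns (the daily sessions below are dummy data, not from the dashboard):

```r
## dummy daily sessions data covering four full Mon-Sun weeks
dates <- seq(as.Date("2015-08-03"), as.Date("2015-08-30"), by = "day")
daily <- data.frame(date     = dates,
                    sessions = 100 + seq_along(dates))

## weekday (Mon = 1 ... Sun = 7) and ISO week number for each date
daily$weekday <- as.integer(format(daily$date, "%u"))
daily$week    <- format(daily$date, "%V")

## weeks as rows, weekdays as columns - ready to pass to a heatmap function
heat <- tapply(daily$sessions, list(daily$week, daily$weekday), sum)

dim(heat)  # 4 weeks x 7 weekdays
```

A matrix in this shape is what heatmap libraries such as d3heatmap expect.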
The data window is set by what you select in the trend graph, and you can zoom in for more detail using the mouse.

TOP LEVEL TRENDS
Quite often, headlines just need a number for a quick check. These data modules give you a quick glance into how you are doing, comparing last week to the week before, last month to the month before, and last month to the same month the year before. Between them, you should see how your data is trending, accounting for seasonal variation.

MYSQL CONNECTION
The code provides functions to connect to a MySQL database, which you can use to blend your data with Google Analytics, provided you have a key to link them on. In the demo dashboard the key used is simply the date, but this can be expanded to include linking a userID from, say, a CRM database to the Google Analytics CID, transaction IDs to offline sales data, or extra campaign information to your campaign IDs. An interface is also provided to let end users update the database by uploading a text file.
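The blending itself reduces to a join on the shared key. With date as the key, as in the demo, a base R `merge()` sketch (the column names and values are invented for illustration) looks like this:

```r
## GA export keyed by date
ga_data <- data.frame(date     = as.Date("2015-08-01") + 0:4,
                      sessions = c(120, 135, 110, 160, 150))

## your own data (e.g. offline sales from a CRM) keyed by the same dates
crm_data <- data.frame(date          = as.Date("2015-08-01") + 0:4,
                       offline_sales = c(10, 12, 9, 20, 17))

## blend on the shared date key; all.x = TRUE keeps GA rows with no match
blended <- merge(ga_data, crm_data, by = "date", all.x = TRUE)

names(blended)  # "date" "sessions" "offline_sales"
nrow(blended)   # 5
```

Swapping the `by` column for a userID or transaction ID gives the other linkings described above.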
CAUSALIMPACT

In the demo dashboard, the MySQL connection is used to upload event data, which is then compared with the Google Analytics data to see if the event had a statistically significant impact on your traffic. This replicates a lot of the functionality of the GA Effect dashboard.

The headline impact of the event is shown in the summary dashboard tab. If it's statistically significant, the impact is shown in blue.

ANOMALY DETECTION
Twitter has released this R package to help detect unusual time points within their data streams, and it is also handy for Google Analytics trend data. The annotations on the main trend plot are indicated using this package, and you can go into more detail and tweak the results in the Analyse section.

MAKING THE DASHBOARD MULTI-USER

In this demo I've taken the usual use case of an internal department looking to report on just one Google Analytics property, but if you would like end users to authenticate with their own Google Analytics property, it can be combined with my shinyga() package, which provides functions that enable self-authentication, similar to my GA Effect/Rollup/Meta apps.

In production, you can publish the dashboard behind a Shinyapps authentication login (needs a paid plan), or deploy your own Shiny Server to publish the dashboard on your company intranet.

QUICK START

Now you have seen the features, the below goes through the process of getting this dashboard for yourself. This guide assumes you know R and Shiny - if you don't, then start here: http://shiny.rstudio.com/

You don't need to have the MySQL details ready to see the app in action, it will just lack persistent storage.

SETUP THE FILES
* Clone/copy-paste the scripts in the github repository to your own RStudio project.
* Find the GA View ID you want to pull data from. The quickest way to find it is to login to your Google Analytics account, go to the View, then look at the URL: the number after "p" is the ID.
* Get your MySQL setup with a user and IP address. See the next section on how this is done using Google Cloud SQL. You will also need to white-list the IP of where your app will sit, which will be your own Shiny Server or shinyapps.io. Add your local IP for testing too. If using shinyapps.io, their IPs are: 54.204.29.251; 54.204.34.9; 54.204.36.75; 54.204.37.78.
* Create a file called secrets.R in the same directory as the app, with the below content filled in with your details.

# secrets.R
options(
  mysql = list(
    "host" = "YOUR SQL IP",
    "port" = 3306,
    "user" = "YOUR SQL USER",
    "password" = "YOUR USER PW",
    "databaseName" = "onlinegashiny"),
  rga = list(
    "profile_id" = "The GA View ID",
    "daysBackToFetch" = 356*3),
  shinyMulti = list(
    "max_plots" = 10
  ),
  myCausalImpact = list(
    "test_time" = 14,
    "season" = 7
  ),
  shiny.maxRequestSize = 0.5*1024^2 ## upload only 0.5 MB
)

CONFIGURING R
1. Make sure you can install and run all the libraries needed by the app:

## functions.R
library(rga)
library(dygraphs)
library(zoo)
library(tidyr)
library(lubridate)
library(d3heatmap)
library(dplyr)
library(stringr)
library(DT)
library(RMySQL)
library(CausalImpact)
library(AnomalyDetection)

## Most are on CRAN so accessible via the Install button in RStudio,
## but for rga, CausalImpact and AnomalyDetection
## you will need to use devtools to install them from github:
> install.packages("devtools")
> library(devtools)
> install_github("google/CausalImpact")
> install_github("rga", "skardhamar")
> install_github("twitter/AnomalyDetection")

2. Run the below command locally first, to store the auth token in the same folder. You will be prompted to login with the Google account that has access to the GA View ID you put into secrets.R, and get a code to paste into the R console. This will then be uploaded with the app and handle the authentication with Google Analytics in production:

> rga::rga.open(where="token.rga")

3. Test the app by hitting the “Run App” button at the top right of the ui.R or server.r script in RStudio, or by running:

> shiny::runApp()
USING THE DASHBOARD

* The app should now be running locally in a browser window with your own GA data. It can take up to 30 seconds for all the data to load the first time.
* Deploy the instance online to shinyapps.io with a free account there, or to your own Shiny Server instance.
* Customise your instance. If for any reason you don’t want certain features, remove the feature in the ui.R script - the data is only called when the needed plot is viewed.

GETTING A MYSQL SETUP THROUGH GOOGLE CLOUD SQL

If you want a MySQL database to use with the app, I use Google Cloud SQL. Setup is simple:

* Go to the Google API console and create a project if you need to.
* Make sure you have billing turned on in the billing accounts menu, top right.
* Go to Storage > Cloud SQL in the left-hand menu.
* Create a New Instance.
* Create a new Database called “onlinegashiny”.
* Under “Access Control” you need to put in the IP of where you test, as well as the IPs of the Shiny Server/shinyapps.io. If you are using shinyapps.io the IPs are: 54.204.29.251; 54.204.34.9; 54.204.36.75; 54.204.37.78.
* Under “IP Address” create a static IP (charged at $0.24 a day).
* You should now have all the access info you need to put in the app’s secrets.R for MySQL access. The port should be the default 3306.
* You can also limit the amount of data that is uploaded via the shiny.maxRequestSize option - the default is 0.5 MB.

SUMMARY
Hopefully the above helps inspire what can be done with your Google Analytics data. The focus has been on giving you tools that let you act on your data. There is a lot more you can do via the thousands of R packages available, but hopefully this gives a framework you can build upon. I’d love to see what you build with it, so please do feel free to get in touch. :)
TAGS
* analytics
* programming
* r
Posted 6 years ago
MY NEW ROLE AS GOOGLE DEVELOPER EXPERT FOR GOOGLE ANALYTICS!

I'm very pleased and honoured to have been accepted into the Google Developer Expert program representing Google Analytics. I should soon have my mug listed with the other GA GDEs at the Google Developer Expert website.
My thanks go to Simo, who nominated me, and Linda, for helping me through the application process.
Alongside my existing work at Wunderman, my role should include some more opportunities to get out there and show what can be done with the GA APIs, so expect me at more analytics conferences soon. I will also get to play with some of the new betas and hopefully be able to create more cool demo apps for users to adapt and use for their own websites, mostly using R Shiny and Google App Engine.

TAGS
* analytics
Posted 6 years ago
HOW I MADE GA EFFECT - CREATING AN ONLINE STATISTICS DASHBOARD USING R

GA Effect is a webapp that uses Bayesian structural time-series to judge if events happening in your Google Analytics account are statistically significant. It's been well received on Twitter, and how to use it is detailed in this guest post on Online Behaviour, but this blog will be about how to build your own or similar.

UPDATE 18TH MARCH: I've made a package that holds a lot of the functions below, shinyga. That may be the easiest to work with.
WHAT R CAN DO
Now is a golden time for the R community, as it gains popularity outside of its traditional academic background and hits business. Microsoft has recently bought Revolution Analytics, an enterprise solution of R, so we can expect a lot more integration from them soon, such as the machine learning in their Azure platform.
Meanwhile RStudio are releasing more and more packages that make it quicker and easier to create interactive graphics, with tools for connecting and reshaping data and then plotting using attractive JavaScript visualisation libraries or native interactive R plots. GA Effect is also being hosted using ShinyApps.io, an R server solution that enables you to publish straight from your console, or you can run your own server using Shiny Server.
PACKAGES USED
For the GA Effect app, the key components were these R packages:

* Shiny, ShinyDashboard and ShinyApps for the web interaction and themes
* Dygraphs for the nice plots
* rga() for the Google Analytics connection
* CausalImpact for the statistics

PUTTING THEM TOGETHER

WEB INTERACTION
First off, using RStudio makes this all a lot easier as they have a lot of integration with their products. ShinyDashboard is a custom theme of the more general Shiny. As detailed in the getting started guide, creating a blank webpage dashboard with shinydashboard takes 8 lines of R code. You can test or run everything locally first before publishing to the web via the “Publish” button at the top.

Probably the most difficult concept to get your head around is the reactive programming in a Shiny app. This is effectively how the interaction occurs, and it sets up live relationships between inputs from your UX script (always called ui.R) and outputs from your server-side script (called server.r). These are your effective front-end and back-end in a traditional web environment. The Shiny package takes your R code and changes it into HTML5 and JavaScript. You can also import JavaScript of your own if you need it to cover what Shiny can’t.
The Shiny code then creates the UI for the app, and creates reactive versions of the datatables needed for the plots.

GOOGLE AUTHENTICATION

The Google authentication flow uses OAuth2 and could be used for any Google API in the console, such as BigQuery, Gmail, Google Drive etc. I include the code used for the authentication dance below so you can use it in your own apps:

## GUIDE TO AUTH2 Authentication in R Shiny (or other online apps)
##
## Mark Edmondson 2015-02-16 - @HoloMarkeD | http://markedmondson.me
##
## v 0.1
##
## Go to the Google API console and activate the APIs you need:
##   https://code.google.com/apis/console/?pli=1
## Get your client ID and client secret for use below, and put the URL of your app in the redirect URIs
## e.g. I put in https://mark.shinyapps.io/ga-effect/ for the GA Effect app,
## and http://127.0.0.1:6423 for local testing
## (start the Shiny app with this command to force the port: runApp(port=6423))
##
## I then have an auth.r file I source, which is below:

## auth.r
## (requires RCurl for postForm() and a JSON parser providing fromJSON())

CLIENT_ID     <- "YOUR CLIENT ID"
CLIENT_SECRET <- "YOUR CLIENT SECRET"
CLIENT_URL    <- 'https://your-url-that-picks-up-return-token.com'
# CLIENT_URL <- 'http://127.0.0.1:6423' # I comment this out for deployment, in for local testing

### Authentication functions

## generate the URL the user clicks on.
## The redirect URL is then returned to, with the extra 'code' and 'state' URL parameters appended to it.
ShinyGetTokenURL <- function(client.id = CLIENT_ID,
                             client.secret = CLIENT_SECRET,
                             redirect.uri = CLIENT_URL) {
  url <- paste('https://accounts.google.com/o/oauth2/auth?',
               'scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fanalytics+', ## plus any other scopes you need
               'https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fanalytics.readonly&',
               'state=securitytoken&',
               'redirect_uri=', redirect.uri, '&',
               'response_type=code&',
               'client_id=', client.id, '&',
               'approval_prompt=auto&',
               'access_type=online',
               sep='', collapse='')
  return(url)
}

## gets the token from Google once you have the code that is in the return URL
ShinyGetToken <- function(code,
                          client.id = CLIENT_ID,
                          client.secret = CLIENT_SECRET,
                          redirect.uri = CLIENT_URL){
  token <- MErga.authenticate(client.id = client.id,
                              client.secret = client.secret,
                              code = code,
                              redirect.uri = redirect.uri)
  return(token)
}

## posts your code to Google to get the current refresh token
MErga.authenticate <- function(client.id, client.secret, code, redirect.uri) {
  opts <- list(verbose = FALSE)
  raw.data <- postForm('https://accounts.google.com/o/oauth2/token',
                       .opts = opts,
                       code = code,
                       client_id = client.id,
                       client_secret = client.secret,
                       redirect_uri = redirect.uri,
                       grant_type = 'authorization_code',
                       style = 'POST')
  token.data <- fromJSON(raw.data)
  now <- as.numeric(Sys.time())
  token <- c(token.data, timestamp = c('first'=now, 'refresh'=now))
  return(token)
}

#### end auth.r

#### Then in Shiny these are the appropriate server.r and ui.r functions

## server.r

shinyServer(function(input, output, session) {

  ### Authentication Functions
  ## AuthCode()    - checks for presence of code in URL
  ## AccessToken() - creates a token once a code is available
  ## ShinyMakeGAProfileTable - the table of profiles taken from API
  ## output$AuthGAURL - creates the authentication URL
  ## output$GAProfile - table of the profiles belonging to user

  AuthCode <- reactive({
    ## gets all the parameters in the URL. Your authentication code should be one of them
    pars <- parseQueryString(session$clientData$url_search)
    if(length(pars$code) > 0){
      return(pars$code)
    }
  })

  AccessToken <- reactive({
    validate(
      need(AuthCode(), "Authenticate To See")
    )
    access_token <- ShinyGetToken(code = AuthCode())
    token <- access_token$access_token
  })

  output$AuthGAURL <- renderUI({
    a("Click Here to Authorise Your Google Analytics Access",
      href = ShinyGetTokenURL())
  })

  ShinyMakeGAProfileTable <- reactive({
    token <- AccessToken()
    ### ... do your call to the Google API with the token .. etc.
  })

})
## end server.r

## ui.r just provides the URL for users to click
## ui.r
# ...
uiOutput("AuthGAURL")
# ...
## end ui.r

FETCHING GOOGLE ANALYTICS DATA

Once a user has authenticated with Google, the user token is then passed to rga() to fetch the GA data, according to which metric and segment the user has selected. This is done reactively, so each time you update the options a new data fetch to the API is made. Shiny apps are on a per-user basis and work in RAM, so the data is forgotten once the app closes down.

DOING THE STATISTICS

You can now manipulate the data however you wish. I put it through the CausalImpact package as that was the application goal, but you have a wealth of other R packages that could be used, such as machine learning, text analysis, and all the other statistical packages available in the R universe. It really is only limited by your imagination.
Here is a link to the CausalImpact paper, if you really want to get in-depth with the methods used. It includes some nice examples of predicting the impact of search campaign clicks.
Here is how CausalImpact was implemented as a function in GA Effect:

## in server.r of a shiny app
casualImpactData <- reactive({

  ## only if we have the data ready
  validate(
    need(chartData(), "Need data")
  )
  data <- chartData()

  ## from user input in ui.r
  start  <- input$range_date[1]
  end    <- input$range_date[2]
  event  <- input$event_date
  season <- as.numeric(input$season)

  ## setting up the necessary data for CausalImpact
  pre.period  <- as.Date(c(start, event))
  post.period <- as.Date(c(event + 1, end))

  ## doing the CausalImpact call and creating the model data
  CausalImpact(data, pre.period, post.period,
               model.args = list(nseasons = season))
})

PLOTTING
dygraphs() is an R package that takes R input and outputs the JavaScript needed to display it in your browser, and as it's made by RStudio it is also compatible with Shiny. It is an application of HTMLwidgets, which lets you take any JavaScript library and make it compatible with R code. Here is an example of how the main result graph was generated:
## in server.r
output$null_plot <- renderDygraph({

  ## don't output anything unless you have the data ready
  validate(
    need(casualImpactData(), "Model Working")
  )

  ## the data for the plot is in here
  ci <- casualImpactData()$series

  ## get the start and end from the user input in ui.r
  start <- input$range_date[1]
  end   <- input$range_date[2]

  ## we need to convert the dataframe into a timeseries for dygraph
  orderme <- seq(as.Date(start), as.Date(end), by=1)
  ci <- xts(ci, orderme)

  ## the dygraph output
  dygraph(data=ci,
          main="Expected (95% confidence level) vs Observed",
          group="ci") %>%
    dyRangeSelector(dateWindow = c(input$event_date - 7, input$range_date[2])) %>%
    dyEvent(date = input$event_date, "Event") %>%
    dySeries(c('point.pred.lower', 'point.pred', 'point.pred.upper'),
             label='Expected') %>%
    dySeries('response', label="Observed")
})

PUBLISHING
I’ve been testing the alpha of shinyapps.io for a year now, but it is just this month (Feb 2015) coming out of beta. If you have an account, then publishing your app is as simple as pushing the “Publish” button above your script, after which it appears at a public URL. With the paid plans, you can limit access to authenticated users only.
NEXT STEPS
This app only took me 3 days, with my baby daughter on my lap during a sick weekend, so I’m sure you can come up with similar given time and experience. The components are all there now to make some seriously great apps for analytics. If you make something, do please let me know!
TAGS
* analytics
* programming
* r
Posted 6 years ago
OSX BLACK SCREEN NO LOGIN SCREEN BUT WITH WORKING CURSOR ON BOOT

I'm just posting this to maybe help others who get the same problem. I had an OSX 10.10.2 update on my 2011 Macbook Air, and left the laptop open last night. This put it in hibernation mode, which breaks the auto-installation, so when I tried to use the laptop this morning it booted to the Apple logo, but then the screen went totally black without the option to login. The cursor was still live though. The fix below will let you login again. It will only work in the above scenario - if it's your backlight broken or something else, keep searching :)

Before the below fix I tried:

* Pressing the increase brightness buttons (duh)
* Restarting in safe mode (doesn't complete login)
* Resetting SMC and PRAM (pushing CTRL+OPTION+POWER+other buttons on powerup - see here: https://discussions.apple.com/docs/DOC-3603)
* Letting it boot, waiting, then pushing the first letter of your username, pushing enter and typing in the password (the most popular fix on the web)
But finally, the solution was found at a forum called Jamfnation via some Google-fu:

* Perform a PRAM reset (Cmd+Option+P+R) on boot - let it chime 3 times and let go
* Boot to Single User Mode (hold Command+S immediately after powering on)
* Verify and mount the drives - once in Single User Mode, run the following commands:
  * /sbin/fsck -fy
  * /sbin/mount -uw /
* After the disk has mounted, run the following commands:
  * rm -f /Library/Preferences/com.apple.loginwindow.plist
  * rm -f /var/db/.AppleUpgrade
* After deleting the files, restart.

Hope it helps if you get this far.

TAGS
* IT
Posted 6 years ago
E-MAIL OPEN RATE TRACKING WITH GOOGLE ANALYTICS' MEASUREMENT PROTOCOL - DEMO

EDIT 4TH FEB 2015 - Google have published an email tracking guide with the Measurement Protocol. The below goes a bit beyond that, showing how to link the user sessions etc.

The Measurement Protocol was launched at the same time as Universal Analytics, but I've seen less adoption of it with clients, so this post is an attempt to show what can be done with it via a practical example. The demo app is available here: http://ua-post-to-push.appspot.com/

With this demo you should be able to track the following:

* You have an email address from an interested customer.
* You send them an email and they look at it, but don't click through.
* Three days later they open the email again at home, and click through to the offer on your website.
* They complete the form on the page and convert.

Within GA, you will be able to see for that campaign 2 opens, 1 click/visit and 1 conversion for that user. As with all email open tracking, you are dependent on the user downloading the image, which is why I include the option to upload an image and not just a pixel, as it may be more enticing to allow images in your newsletter.

INTRO
The Measurement Protocol lets you track beyond the website, without the need for client-side JavaScript. You construct the URL, and when that URL is loaded, you see the hit in your Google Analytics account. That's it. The clever bit is that you can link user sessions together via the CID (Client ID), so you can track the upcoming Internet of Things off-line to on-line, but also things like email opens and affiliate thank-you pages. It also works with things like enhanced e-commerce, so it can be used for customer refunds or product impressions. This demo looks at e-mail opens for its example, but it takes only minor modifications to track other things. For instance, I use a similar script to measure in GA when my Raspberry Pi is backing up our home computers via Time Machine.

DEMO ON APP ENGINE
To use the Measurement Protocol in production most likely needs server-side code. I'm running a demo on Google App Engine coded in Python, which is pretty readable, so it should be fairly easy for a developer to replicate in their favourite language. App Engine is also a good choice if you want to run it in production, since it has a free tier for tracking 1000s of email opens a day, but scalability to handle millions.

The code is available on Github here: http://github.com/MarkEdmondson1234/ga-get-to-post
The app running that code is here: http://ua-post-to-push.appspot.com/

There are instructions on Github on how it works, but I'll run through some of the key concepts in this post.

WHAT THE CODE DOES
The example has four main URLs:

* The homepage explaining the app
* The image URL itself, that when loaded creates the hit to GA
* A landing page with an example custom GA tracking script
* An upload image form to change the image you would display in the e-mail

The URLs above are controlled server side with the code in main.py.

HOMEPAGE

This does nothing server side aside from serving up the page:

class MainPage(webapp2.RequestHandler):
    """Get a URL, POST a GA hit"""
    def get(self):
        template_values = {}
        template = JINJA_ENVIRONMENT.get_template('main.html')
        self.response.write(template.render(template_values))

IMAGE URL
This is the main point of the app - it turns a GET request for the uploaded image into a POST with the parameters found in the URL. It handles the different options and sends the hit to GA as a virtual pageview or event, with a unique user CID and campaign name. An example URL here is:

http://your-appengine-id.appspot.com/main.png?cid=blah&p=1&c=email_campaign

def getUniqueClientId(seed=''):
    """Function to create the cid from the parameter passed in.
       Change this as needed."""
    if seed:
        random.seed(seed)
    ## make this so its always unique by referring to a set or using md5 or something
    theID = str(random.randint(1,9999)).zfill(4) + "-" + \
            str(random.randint(1,9999)).zfill(4) + "-" + \
            str(random.randint(1,9999)).zfill(4) + "-" + \
            str(random.randint(1,9999)).zfill(4)
    return theID

class ImageRequest(blobstore_handlers.BlobstoreDownloadHandler):
    """The image that is in the email, with a unique ID attached to it.
       This is called when the image is viewed, say in an email or another website."""
    def get(self):
        p     = cgi.escape(self.request.get('p'))
        c     = cgi.escape(self.request.get('c'))
        cid   = cgi.escape(self.request.get('cid'))
        nohit = cgi.escape(self.request.get('nohit'))

        ## if it has the same seed, creates an id like xxxx-xxxx-xxxx-xxxx
        cid = getUniqueClientId(cid)

        ## construct the Measurement Protocol call, refer to
        ## https://developers.google.com/analytics/devguides/collection/protocol/v1/devguide
        ga_url_stem = "http://www.google-analytics.com/collect"
        values = {'v'   : 1,
                  'tid' : 'UA-54019251-3', ## replace with your GA ID
                  'cid' : cid}

        if p: ## make a pageview when people see the image
            values['t']  = 'pageview'
            values['dh'] = 'external_email'
            values['ec'] = 'email'
            values['ea'] = 'open'
        else: ## else make an event
            values['t']  = 'event'
            values['ec'] = 'email'
            values['ea'] = 'open'

        if c: ## put campaign info in the pageview
            values['cn'] = c
            values['dp'] = '/vpv/email-view/' + c
        else: ## put campaign info in the event labels
            values['cn'] = "campaign_name"
            values['dp'] = '/vpv/email-view'

        ### z is the cache buster
        values['z'] = str(random.randint(1,999999)).zfill(6)

        if not nohit: ## nohit=1 if you don't want to send the hit to GA
            ### send the hit to Google as a POST
            data = urllib.urlencode(values)
            req = urllib2.Request(ga_url_stem, data)
            response = urllib2.urlopen(req)
            the_page = response.read()
            print values ## look in the logs to see what was sent

        ### get the image previously uploaded at /upload-image.html and stored in datastore
        pixel = Pixel.get_by_id("image")

        ### serve up the image
        if pixel:
            img = blobstore.BlobInfo.get(pixel.img)
            self.send_blob(img)
        else:
            self.response.out.write("no image")

LANDING PAGE
This does little but take the cid you put in the email URL, and outputs the CID that will be used in Google Analytics. If this is the same CID as in the image URL and the user clicks in the email, those sessions will be linked. You can also add the GA campaign parameters, but the server side script ignores those - the JavaScript on the page will take care of it. An example URL here is:

http://your-appengine-id.appspot.com/landing-page?cid=blah&utm_source=source_me&utm_medium=medium_me&utm_campaign=campaign_me

class LandingPage(webapp2.RequestHandler):
    """Example page where content is - utm parameters should be used,
       plus cid which will link the impression and visit."""
    def get(self):
        cid = cgi.escape(self.request.get('cid'))
        clientId = getUniqueClientId(cid)
        print clientId
        template_values = {'clientId' : clientId}
        template = JINJA_ENVIRONMENT.get_template('landing-page.html')
        self.response.write(template.render(template_values))

The CID in the landing page URL is then captured and turned into an anonymous CID for GA. This is then served up to the Universal Analytics JavaScript on the landing page, shown below. Use the same UA code for both, else it won't work (e.g. UA-123456-1):

(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXX-X', 'auto', {'clientId': '{{clientId}}'});
ga('send', 'pageview');
ga('send', 'event', 'email', 'visit', 'email_content');

UPLOAD IMAGE
This just handles the image uploading and serves the image up via App Engine's blobstore. Nothing pertinent to GA here, so see the Github code if interested.
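For reference, the Measurement Protocol payload the image handler sends can be built with nothing but the standard library. This is a Python 3 sketch (the demo above is Python 2), with a hypothetical helper name and a placeholder tid; the parameter names (v, tid, cid, t, dp, cn) are from the Measurement Protocol reference:

```python
# Minimal sketch of building an 'email open' hit as a virtual pageview.
# build_email_open_hit is a hypothetical helper; 'UA-XXXXX-Y' is a placeholder.
from urllib.parse import urlencode

def build_email_open_hit(cid, campaign=None):
    """Return the POST body for a virtual-pageview email-open hit."""
    values = {
        'v': 1,                # protocol version
        'tid': 'UA-XXXXX-Y',   # your GA property ID (placeholder)
        'cid': cid,            # client ID that links the open to later visits
        't': 'pageview',       # send the open as a virtual pageview
        'dp': '/vpv/email-view' + ('/' + campaign if campaign else ''),
    }
    if campaign:
        values['cn'] = campaign  # campaign name
    return urlencode(values)

body = build_email_open_hit('1234-5678-9012-3456', campaign='spring_sale')
# POST `body` to http://www.google-analytics.com/collect to register the hit
print(body)
```

The same body works from any environment that can make an HTTP POST, which is the whole appeal of the protocol - a cron job, a Raspberry Pi or an email server can all send hits this way.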
SUMMARY
It's hoped this helps sell using the Measurement Protocol to more developers, as it offers a solution to a lot of the problems with digital measurement today, such as attribution of users beyond the website. The implementation is reasonably simple, but the power is in what you send and in what situations. Hopefully this inspires what you could do with your setup.

There are some limitations to be aware of - the CID linking won't stitch sessions together, it just discards a user's old CID if they already had one, so you may want to look at userID, or at how to customise the CID for users who visit your website before the email is sent. The best scenario would be if a user is logged in for every session, but this may not be practical. It may be that the value of linking sessions is so advantageous in the future that entire website strategies will be focused on getting users to ID themselves, such as via social logins.

Always consider privacy: look for users to opt in, and make sure to use GA filters to take out any PII you may put into GA as a result. Current policy looks to be that if the data within GA cannot be traced to an individual (e.g. a name, address or email), then you are able to record an anonymous personal ID that could be exported and linked to PII outside of GA. This is a bit of a shifting target, but in all cases keeping it as user-focused and not profit-focused as possible should see you through any ethical questions.

TAGS
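One sketch of customising the CID as suggested: derive it deterministically from a hashed login identifier, so the email pixel and the website produce the same CID for the same user. This is an illustration only (hypothetical helper name; hashing an email is still PII-adjacent, so the privacy points on this page apply):

```python
# Derive a stable CID from a user identifier, so the same user always gets
# the same CID in the email pixel and on the website. Sketch only - the
# xxxx-xxxx-xxxx-xxxx shape just mirrors the demo's getUniqueClientId().
import hashlib

def stable_cid(user_identifier):
    """Hash an identifier into a stable xxxx-xxxx-xxxx-xxxx style CID."""
    digest = hashlib.md5(user_identifier.encode('utf-8')).hexdigest()
    return '-'.join(digest[i:i+4] for i in range(0, 16, 4))

# the same input always yields the same CID, so sessions can be stitched
print(stable_cid('user@example.com') == stable_cid('user@example.com'))  # True
```

With something like this, a random seed is no longer needed - any system that knows the user's login can reproduce the CID, which is the essence of the "get users to ID themselves" strategy above.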
* cloud
* programming
* analytics
Posted 7 years ago