現代網絡分析師所扮演的角色的四件事
作為網站分析師,你需要做以下四件事:
- 趨勢報告及數據報告
- 分析當前線上營銷獲取策略以及探索新機會和/或新策略
- 理解網站用戶的行為和經歷
- 持續關注趨勢和細節
三個階段
我們可以將現代網站分析師的工作分為三個階段:
- 數據收集
- 數據提取
- 數據處理及展示
第一階段:數據收集
假設在線業務大多數都是在你公司官網上進行,那麼大部分線上營銷、搜索營銷和用戶行為活動的相關數據,都可以通過:
網頁日誌文件收集 — 你必須精通網頁日誌文件收集數據的原理,並且知道哪些數據是可收集的。網站日誌文件可以"記錄"所有用戶在網站上加載的文件,因此你可以輕易地發現網頁的哪些"部分"沒有響應用戶的請求。
網站分析 — 全球大部分網站都使用分析工具。網站分析工具一般具有圖形界面,可以快速顯示用戶的數據趨勢。所有數據可以以表格、文本文件甚至是 PDF 文件的形式下載到本地。
利用網站分析工具收集用戶數據前,需要安裝基礎設置來追踪數據。通常要在網站所有HTML頁面插入一些JavaScript追踪腳本或者1×1像素的腳本。如果需要收集的用戶數據超出默認設置的範圍,則需在常規追踪腳本外另裝高級追踪腳本。
這些包括:
- 國外的免費工具:谷歌分析
- 國內的免費工具:百度統計、CNZZ 統計、騰訊分析
- 國外的付費工具:Adobe Site Catalyst
- 國內的付費工具:99Click
搜索營銷平台 — 有些網站利用第三方搜索營銷管理和跟踪平台,比如Kenshoo, Marin Software 或Adobe AdLens(以前叫Efficient Frontier)。這種情況下,需要在網站上安裝該平台的跟踪腳本,才能收集信息。
第二階段:數據提取
數據收集好後則要提取信息。第一階段收集的原始數據需要轉換成報告,主要有兩種形式:
- 常規數據報告:這些報告通常有一定的時間規律,按照報告性質可按每天、每周和每月生成。報告根據收取人的要求分成不同等級,比如執行官可能跟公司主要部門開會時需要展示關鍵收入的報表。運營經理可能需要中級數據報告去追踪他們組負責的產品的"潛在問題"。
- 即時數據報告:這些報告並非常規需要。通常即時數據報告是為了回顧所有一次性線上活動的效果。如需深入了解數據所代表的問題,比如為什麼過去兩週KPI 數字有所下降,這時也可用到即時數據報告。
商業智能(BI)團隊可能已經處理過原始數據並把數據轉化成可讀報表。這類報表可以從一些BI數據倉庫系統獲取,比如Cognos或其他相似的數據平台。通常公司都是用這些來做常規報表。即時報表則需要更高的處理時效。
要想從MySQL數據庫、Oracle數據庫或者其他數據庫直接提取數據,你需要用到SQL語句。並非收集的所有原始數據都會直接進入數據庫或者數據倉庫,因此你要花大量時間對提取後的數據做進一步處理。比如:
- 如果你選擇了一個免費的網站分析工具,比如Google Analytics,那麼網站數據則全部存放在谷歌的服務器上。你可以在GA後台 上選擇以表格形式把數據報告下載到本地,或者通過谷歌提供的API 直接提取數據報表。
- 搜索營銷平台的數據有可能也存放在供應商的服務器上,此時你只能以表格的形式提取數據。
第三階段:數據處理及展示
Excel:在製作漂亮數據圖表報告之前,你需要對SQL語句提取的數據做一些處理。通過Excel做漂亮報表相當耗時,因此,熟練並專業地使用Excel各種功能和公式能夠極大地提高工作效率。例如利用透視表整合數據、利用VLOOKUP函數合併數據和利用MONTH、DAY等函數處理日期等。
注意:
這篇文章的中文版以前在WAW(網站分析星期三)的微信公眾號發布過。
- 中文版的翻譯者是Lucky Chen。
- 英文原版是我寫的,以前在 Clickz.com上發布過。
中文版和英文版,這次我都一起放出來。以上是中文,以下是英文。
我寫的英文原版:
The Role of the Modern Day Web Analyst
As a web analyst, your role requires you to perform four major tasks:
- Trend and data reporting
- Analyzing current online marketing acquisition strategies and exploring new opportunities and/or new strategies
- Understanding on-site visitor behavior and experiences
- Staying connected with the trends and the details
Let's go through the three phases for the modern day web analyst:
- Data capture
- Data extraction
- Data manipulation and presentation
Phase 1: Data Capture
Assuming your online business mostly happens on your company's websites, most of the online marketing, search marketing and user behavior activities can be captured with:
Web Log Files – You must be very familiar with the principles of how web log files capture data and what data is available. Web log files give you the ability to "record" all the files that were loaded by the user when they accessed your websites, and you can easily see which "components" of your websites aren't responding to user requests.
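As a minimal sketch of the log-parsing idea above, the snippet below scans log lines for 4xx/5xx status codes to surface files that failed to load. It assumes logs in the common Apache/Nginx "combined" format; adjust the pattern to your server's actual configuration.

```python
import re

# Regex for the Apache/Nginx "combined" log format (an assumption about
# the server's setup; adapt it to whatever format your logs really use).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+)'
)

def failed_requests(lines):
    """Return (path, status) pairs for requests that did not succeed."""
    failures = []
    for line in lines:
        m = LOG_PATTERN.match(line)
        # 4xx and 5xx statuses mark components that failed to respond.
        if m and m.group("status").startswith(("4", "5")):
            failures.append((m.group("path"), m.group("status")))
    return failures

sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '1.2.3.4 - - [10/Oct/2023:13:55:37 +0000] "GET /img/logo.png HTTP/1.1" 404 209',
]
print(failed_requests(sample))  # → [('/img/logo.png', '404')]
```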
Web Analytics – Most websites globally use some form of analytics tool. A typical web analytics tool provides a graphical user interface (GUI) and allows you to quickly see the data trends of your users. Reports can be downloaded as spreadsheets, text files, or sometimes even PDF files.
Before a web analytics tool can capture users' data, you are responsible for implementing the required tracking setup. This normally requires inserting some JavaScript tracking scripts or some 1×1 pixel scripts onto all HTML pages of your websites. If the objectives include capturing more than the default amount of user data, then you are required to implement some advanced tracking scripts on top of the regular tracking scripts.
These include:
- Free & global: Google Analytics
- Free & local (in China): Baidu Tongji, CNZZ Tongji, Tencent Analytics
- Paid & global: Adobe Site Catalyst
- Paid & local (in China): 99Click
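To illustrate the 1×1-pixel mechanism described above: the pixel's image URL carries the data as query parameters, which the collection server then decodes. The endpoint and parameter names below are hypothetical, assembled with Python's standard urllib for illustration; real tools define their own.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical collection endpoint; tools like Google Analytics or Adobe
# use their own endpoints and parameter names.
COLLECT_URL = "https://analytics.example.com/collect.gif"

def beacon_url(page, visitor_id, **extra):
    """Build the URL a 1x1 tracking pixel would request from the page."""
    params = {"page": page, "vid": visitor_id, **extra}
    return COLLECT_URL + "?" + urlencode(params)

url = beacon_url("/pricing", "abc123", campaign="spring_sale")

# On the collection side, the parameters are recovered from the request URL:
captured = {k: v[0] for k, v in parse_qs(urlparse(url).query).items()}
print(captured)  # → {'page': '/pricing', 'vid': 'abc123', 'campaign': 'spring_sale'}
```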
Search Marketing Platforms – Some websites make use of third-party search marketing management and tracking platforms such as Kenshoo, Marin Software, or Adobe AdLens (formerly called Efficient Frontier). You are required to implement the platform's tracking scripts onto your websites in order for the data capture to work.
Phase 2: Data Extraction
Once the data is collected, the next phase is to extract it for the end users. The raw data collected in phase 1 should be converted into reports serving two major purposes:
- Regular Data Reports: These reports need to be received on a regular basis, whether that's once per day, per week, or per month, depending on the report. These reports are categorized into different levels depending on the receiver: an executive would need high-level reports showing key revenue numbers for each major division of the company, while operational managers would be looking at mid-level data reports that allow them to track "potential problems" in the products their team is responsible for.
- Ad Hoc Data Reports: These reports aren't produced on any fixed schedule. Normally, ad hoc reports are required to review the results of one-off online campaigns. They are also required when you need to dive deep into the data to figure out problems, such as why certain KPI numbers have decreased over the past two weeks.
The business intelligence (BI) team may already have processed the raw data and converted it into readable reports. These reports can be obtained from BI data warehousing systems, for example Cognos or similar data cubes, and they can form a large part of a company's regular reporting. Ad hoc reports often require a quicker turnaround time.
This is where your SQL query skills come in, letting you extract data directly from the databases, whether they are MySQL, Oracle, or other databases. Not all the raw data you have captured goes straight into your databases or data warehouses. For this reason, a large amount of time then goes into post-extraction data manipulation. For example:
- If your choice of web analytics is a free tool, such as Google Analytics, then your web data is all sitting on Google's servers. Your options are to either download the data reports onto spreadsheet formats through the Google Analytics online interface, or extract the data reports through Google's APIs.
- The data from the search marketing platforms may be sitting on your vendor's servers, in which case you can only extract it in spreadsheet formats.
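As a sketch of the direct-database route, the example below uses Python's built-in sqlite3 module as a stand-in for a production MySQL or Oracle database; the table, columns, and query are illustrative only, not any real schema.

```python
import sqlite3

# In-memory database standing in for a production warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pageviews (day TEXT, path TEXT, visits INTEGER)")
conn.executemany(
    "INSERT INTO pageviews VALUES (?, ?, ?)",
    [("2023-10-01", "/home", 120),
     ("2023-10-01", "/pricing", 45),
     ("2023-10-02", "/home", 150)],
)

# The kind of ad hoc query an analyst might run: total visits per day.
rows = conn.execute(
    "SELECT day, SUM(visits) FROM pageviews GROUP BY day ORDER BY day"
).fetchall()
print(rows)  # → [('2023-10-01', 165), ('2023-10-02', 150)]
```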
Phase 3: Data Manipulation & Presentation
Excel – Before you can present pretty graphical reports, the data extracted through ad hoc SQL queries will need to be manipulated. Creating pretty reports in Excel can take a lot of time; therefore, expert knowledge of how to get the most out of Excel's features and formulas will only improve efficiency. These could include: aggregating data with pivot tables, merging data with VLOOKUP, and manipulating dates with functions like DAY and MONTH.
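For illustration, the same three Excel techniques (pivot-style aggregation, a VLOOKUP-style merge, and date functions) can be sketched in plain Python; all names and figures below are made up.

```python
from collections import defaultdict
from datetime import date

# Rows as they might come out of an ad hoc SQL extract (illustrative data).
rows = [
    {"day": date(2023, 10, 1), "channel": "search", "visits": 120},
    {"day": date(2023, 10, 1), "channel": "email",  "visits": 30},
    {"day": date(2023, 10, 2), "channel": "search", "visits": 150},
]

# Pivot-table-style aggregation: total visits per channel.
by_channel = defaultdict(int)
for r in rows:
    by_channel[r["channel"]] += r["visits"]

# VLOOKUP-style merge: attach an owner from a hypothetical lookup table.
owners = {"search": "Alice", "email": "Bob"}
merged = [{**r, "owner": owners[r["channel"]]} for r in rows]

# Date manipulation equivalent to Excel's MONTH()/DAY().
months = sorted({r["day"].month for r in rows})

print(dict(by_channel), merged[0]["owner"], months)
# → {'search': 270, 'email': 30} Alice [10]
```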