This article examines why using relevant data and a structured testing process is important for an organisation.
To begin, we base our consulting on data made available by clients and the insights we can gain from it. The internet has made large-scale data collection more accessible than ever before. While collecting thousands of different data points can be useful, data overload often does more harm than good. In this article, we work through a simple example of a topic that has been vastly overcomplicated in recent years.
At a base level, your organisation should be collecting web clickstream data through an application such as Google Analytics. This will tell you which pages people view on your website, how long they spend on them, and where they came from. Additional tools can collect qualitative data, such as surveys of web visitors, or data on where visitors click and scroll. There are even web applications that let you record visitor browser sessions to analyse their behaviour.
With so much data available, it is easy to get lost in it. To pick out the low-hanging fruit, it helps to understand where users typically get lost or deviate from the paths you want them to follow.
A good place to start is your landing page, the first page a user sees when visiting your website. A large proportion of your landing traffic will arrive on your home page, followed by specific pages set up for campaigns. You can use a tool such as Google Analytics to see where users go from your home page, and how this compares to where you expect them to go. You can then adjust your calls to action and content to give visitors more of what they are looking for. This process is formally called Conversion Rate Optimisation, and involves running statistical experiments comparing one version of content against another (A/B testing).
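To make the "statistical experiment" part concrete, here is a minimal sketch of how an A/B test result can be evaluated with a standard two-proportion z-test. This is a generic illustration, not code from any particular analytics tool; the function name and the example numbers are our own.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test.

    conv_a / n_a: conversions and visitors for the control (version A).
    conv_b / n_b: conversions and visitors for the variant (version B).
    Returns the z statistic and a two-sided p-value.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: control converts 90 of 1,000 visitors, variant 120 of 1,000
z, p = two_proportion_z_test(90, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so the uplift is significant
```

In practice your A/B testing platform will do this calculation for you; the point is that "version B won" is always a statistical statement about conversion rates, not a gut feeling.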
If you’re running an e-commerce website, the checkout process is, in most cases, a major opportunity for improvement. This includes usability, and the presence of trust and social-proof indicators on your website.
To set up a testing schedule and identify improvements, complete an audit of your website and rank potential tests by priority. Identify the highest-value improvements and test those first. The number of conversions an A/B test needs before your control (current content) and the experimental content can be compared with statistical confidence depends on your baseline conversion rate and the size of the effect you want to detect; for modest uplifts, plan on at least a few hundred conversions per variant. Depending on your website traffic, you can then build a 6–12 month A/B testing schedule and have a regular result to review in your team meetings. Walter Analytics can help you to set up such a structure and recommend software for further usability testing and feedback.
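When planning the schedule, a rough sample-size estimate tells you how long each test will need to run at your traffic level. The sketch below uses the standard normal-approximation formula for comparing two proportions; the function name, default significance level (5%) and power (80%) are our assumptions for illustration.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_base, uplift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a relative uplift.

    p_base: current (baseline) conversion rate, e.g. 0.03 for 3%.
    uplift: relative improvement to detect, e.g. 0.20 for a 20% lift.
    Uses the two-sided normal approximation for two proportions.
    """
    p_alt = p_base * (1 + uplift)            # expected variant rate
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for power
    p_bar = (p_base + p_alt) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p_base * (1 - p_base)
                                   + p_alt * (1 - p_alt))) ** 2
    return math.ceil(numerator / (p_alt - p_base) ** 2)

# Detecting a 20% lift on a 3% baseline needs roughly 14,000 visitors
# per variant -- divide by your weekly traffic to estimate test duration.
print(sample_size_per_variant(0.03, 0.20))
```

Running this for your own baseline rate quickly shows why low-traffic sites should test bigger, bolder changes: small expected uplifts can require months of traffic per test.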
So, to answer the question “Why are data and testing important?”, we believe the answer lies in leading the field rather than merely sitting among the pack of your competitors. By constantly pushing for higher success and engagement, you gain a repeatable process for validating ideas and using the resulting feedback to improve. Case studies of A/B testing at larger scales have shown large shifts in performance and conversion from changing very small factors that might otherwise be left to a design agency with no vested interest in the ROI of your website.