This document covers definitions and logic, as well as the calculations behind each data point in RUM.
This document is best paired with RUM that is actively collecting data.
For setup instructions, please see our RUM setup document.
- AJAX Load
- Bounce Rate
- Page Load Time Distribution
- Page Load Time Breakdown
- Page Load Time Trend
- Page Views
RUM Data Points
This section will define the various data points that comprise a RUM report.
AJAX Load
Time taken to make and process AJAX calls. AJAX is a method for loading and processing data asynchronously, which lets websites change a page in real time. Examples include: a live timer counting down to a sale, an item appearing inside a shopping cart, or refreshing a specific value or set of values every X seconds. See this document for more information on the AJAX load time breakdown.
No assets are loaded during these calls, so AJAX load time values are typically well below those in the page load time breakdown. Uptime.com recommends 1000 ms as the maximum “acceptable” AJAX load time.
Apdex stands for Application Performance Index, an open standard developed to measure the performance of software applications. It converts raw measurements into more specific insights about user satisfaction with the state of your applications.
Uptime.com RUM uses a customizable Apdex threshold to determine a range of “satisfied” performance. These levels correspond to color-coded reporting and help to visualize the user experience.
Satisfied: by default, represents users experiencing load times of 0-4 seconds. This is the default threshold set when creating RUM. A user is considered Satisfied if the response time is less than or equal to the threshold value.
Tolerating: by default, represents users experiencing load times of 4-16 seconds and indicates there may be performance issues. If the Apdex threshold has been manually adjusted, a user is considered Tolerating if the response time is less than or equal to 4 times the threshold.
Frustrated: by default, indicates users experiencing load times over 16 seconds, signaling definite performance issues. If the Apdex threshold has been manually adjusted, a user is considered Frustrated if the response time is greater than 4 times the threshold.
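The classification rules above can be sketched in code. This is a minimal illustration (the function names are ours, not an Uptime.com API), using the default 4-second threshold; the score formula is the standard Apdex calculation:

```python
def apdex_bucket(response_time_s, threshold_s=4.0):
    """Classify a response time against the Apdex threshold T:
    Satisfied t <= T, Tolerating T < t <= 4*T, Frustrated t > 4*T."""
    if response_time_s <= threshold_s:
        return "satisfied"
    elif response_time_s <= 4 * threshold_s:
        return "tolerating"
    return "frustrated"

def apdex_score(response_times_s, threshold_s=4.0):
    """Standard Apdex score: (satisfied + tolerating/2) / total samples."""
    buckets = [apdex_bucket(t, threshold_s) for t in response_times_s]
    satisfied = buckets.count("satisfied")
    tolerating = buckets.count("tolerating")
    return (satisfied + tolerating / 2) / len(buckets)
```

For example, with the default threshold a 2-second load is Satisfied, a 10-second load is Tolerating, and a 20-second load is Frustrated.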
Bounce Rate
The bounce rate of RUM is calculated as follows:
(Number of bounced sessions * 100)
Total sessions
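As an illustration of this calculation (function name is ours; we assume the denominator is the total number of sessions in the selected period):

```python
def bounce_rate(bounced_sessions, total_sessions):
    """Bounce rate as a percentage: (bounced sessions * 100) / total sessions."""
    if total_sessions == 0:
        return 0.0  # avoid division by zero when no sessions were recorded
    return bounced_sessions * 100 / total_sessions
```

For example, 25 bounced sessions out of 100 total sessions gives a bounce rate of 25%.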
Bounce Rate vs. Sessions
This graph compares the bounce rate of your selected date range (expressed as a percentage) to the number of sessions for the selected period. Bounce rate can be plotted higher than the number of sessions if the number of sessions is excessively low, or bounce rate % is excessively high.
Bounce Rate vs. Load Time
This graph compares the bounce rate of your selected date range (expressed as a percentage) to the page load time for the selected period. It is intended to help correlate slow performance with a high bounce rate, and should be reviewed as part of your assessment of the impact of your site’s performance.
Bounce Rate vs. Errors
This graph compares the bounce rate of your selected date range (expressed as a percentage) to the percent of errors per 4XX, 5XX, and JS errors. High occurrences of errors can lead directly to users bouncing from the page, and this graph is intended to assist investigations into errors and availability.
Bounce Rate by Segment
This section offers direct insights into bounce rate broken down by individual URL or URL group. This section includes metrics for both sessions and Time to Interactive, as well as a comparison to the previous baseline of data.
Bounce rate is not calculated immediately, so it will almost always show as 0% when filtering by the previous 30 minutes. We highly suggest adjusting the date range of your report to 1 hour or longer.
The Errors tab is where you discover whether there is a problem with your site; it helps you zero in on the problem URL or URL group(s).
Error rate is calculated as follows:
(Number of page views with errors * 100)
Total page views
Error rate should not be taken as an indicator of a website’s availability; users should always utilize both HTTP(S) and Transaction checks as applicable for a true indication of uptime. Error rate can also reflect third-party applications that fail to load, or an expected failure (such as a 401 error on an access-restricted page).
Errors with status codes beginning in 4 (4xx) indicate a client-side error, such as unauthorized access or a resource not found. Main page loads are marked 4xx if an AJAX call that runs during page load returns a 4xx status code.
A certain number of 4xx errors may be expected (such as unauthorized access).
Errors with status codes beginning in 5 (5xx) indicate a server-side error, such as a bad gateway or service unavailable. Typically, 5xx errors signal that a URL or URL group is unreachable or inaccessible, and any rise in 5xx errors should be investigated thoroughly. Main page loads are marked 5xx if an AJAX call that runs during page load returns a 5xx status code.
Tip: URLs that see 5xx errors are good candidates for HTTP(S) checks, if they are critical to transaction pathways.
Page Load Time Distribution
This graph displays the distribution of users experiencing load times that correspond to the Apdex thresholds for Satisfied, Tolerating, and Frustrated. The Apdex threshold is customizable during check setup and when you edit your RUM check. The X axis represents the range of load times, while the Y axis represents the number of page views.
This histogram does not use aggregation; it is a visualization of all possible aggregations. The median load time therefore appears at the center of the graph, while the 99th percentile sits close to the right-hand side.
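To make the median/99th-percentile relationship concrete, here is a sketch of how those two summary points could be read off a set of raw load times (illustrative only; Uptime.com's exact percentile method is not documented here):

```python
import statistics

def load_time_percentiles(load_times_ms):
    """Return (median, p99) for a set of page load times in milliseconds.
    The median sits at the center of the histogram; the 99th percentile
    sits near its right edge."""
    ordered = sorted(load_times_ms)
    median = statistics.median(ordered)
    # Nearest-rank 99th percentile (one of several common definitions).
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    return median, p99
```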
Page Load Time Breakdown
Page Load Time Breakdown details communication times between endpoints. The chart breaks load time into connection, server, transfer, client, and asset segments for both the current period and the baseline (by default, the median of the previous week’s combined page views and load times; this is adjustable on the reports page).
The data is displayed in milliseconds for the current period, with Up/Down indicators showing the percent increase or decrease from the past week’s baseline.
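The Up/Down indicator is a simple percent-change calculation against the baseline. A sketch (the function name and output format are hypothetical, not Uptime.com's exact rendering):

```python
def baseline_delta(current_ms, baseline_ms):
    """Format a segment value with an Up/Down indicator showing the
    percent increase or decrease from the past week's baseline."""
    change = (current_ms - baseline_ms) * 100 / baseline_ms
    direction = "Up" if change >= 0 else "Down"
    return f"{current_ms} ms ({direction} {abs(change):.1f}% vs. baseline)"
```

For example, a current value of 320 ms against a 400 ms baseline is a 20% decrease.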
Each data type is defined below in greater detail:
Connection represents the time it takes to perform a DNS lookup, establish the connection, and send a web page request. The end user has sent an initial request to your site and connected during this phase.
Server represents the time it takes the server to process the request and initiate a response. The user still sees nothing on the site; this metric represents server response time.
Transfer represents the time it takes for the browser to download the web page's HTML.
Assets represents the time taken to load any additional non-critical page assets, such as images and videos.
The baseline for a given data point is the period from the present moment to the previous 7 days’ (168 hours) worth of data. Only error rate and Apdex are compared to the baseline period.
Time periods outside this range will not be compared to a baseline value. For example, a date range of the past 30 days falls outside the baseline and will have no baseline comparison. The same is true for the past 7 days, because the baseline begins at the present moment, while the previous 7 days of data begins at midnight of the first day. Note also that not all parameters are aggregated using the average or median: page views, AJAX calls, and sessions are summed.
Page Load Time Trend
Page Load Time Trend includes several data points that represent page load in terms of when the user first sees something and when the page is fully interactive. The left-hand Y axis of this graph represents page views, the right-hand Y axis represents the range of load times, and the X axis shows the selected date range.
This graph includes the most detail for page load, and is useful as an indicator of the actual user experience. From request to fully interactive, this graph will show how load times, page views, and timeframe are correlated.
The data points used in this graph are defined in further detail below.
Time elapsed between a browser request and the first byte received from the server. This metric is a strong indicator of server performance, similar to the performance metrics reported by an HTTP(S) check, but based on real user sessions.
First Paint
The time it takes the user’s browser to render the first pixels distinctly different from the previous point of navigation. This metric measures the time elapsed between the user sending the request to your server (i.e., typing in your URL and starting navigation) and receiving the first byte and seeing the first visual signs of your website.
Time to Interactive (TTI)
Time elapsed before a page becomes fully interactive and usable. The goal of TTI is to provide a measurement that reflects when a user can fully interact with a website. As a result, this metric takes several events into account:
- AJAX requests that fetch data for a page asynchronously. We track these events to account for the time it takes to download all the necessary data.
- FirstContentfulPaint (FCP): this event fires when the site’s content first appears on screen for a user. We track this event to ensure that all data has loaded successfully and that blocking CSS is not preventing the user from interacting with the page.
Computing TTI involves taking the maximum (latest) timestamp from the above events. The gap between TTI and First Paint can indicate areas for improvement that will reduce performance declines.
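The maximum-timestamp step can be sketched as follows. This is a simplified model (names are ours; all timestamps are milliseconds relative to navigation start), not Uptime.com's actual implementation:

```python
def time_to_interactive(ajax_complete_ms, first_contentful_paint_ms):
    """TTI sketch: the latest timestamp among the tracked events --
    the last AJAX request to finish and FirstContentfulPaint (FCP)."""
    # max(..., default=0) handles pages that made no AJAX requests
    return max(max(ajax_complete_ms, default=0), first_contentful_paint_ms)
```

For example, if FCP fired at 800 ms but an AJAX request completed at 1200 ms, TTI is 1200 ms; with no AJAX requests, TTI falls back to FCP.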
Page Views
Page views exclude any AJAX calls. Page views are an important metric for comparing against other key performance indicators, such as Time to Interactive, bounce rate, error rate, and load time breakdowns. It is possible to add additional RUM page views to your account, which can increase the frequency of capturing a RUM session. In general, one RUM visit is counted every 30 seconds.
All data aggregates are medians, not averages, with the exception of page views, which is a SUM compared to the average number of page views during 30-minute intervals from the past week’s data points.
The SUM is always compared to the time-range-adjusted value; i.e., if you are viewing a report for a 1-hour period, page views are compared to the 1-hour average from the past week’s data points. For periods longer than 1 week, a baseline is not provided.
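The time-range adjustment described above amounts to scaling the past week's total down to the report's window size. A sketch under that assumption (names are illustrative):

```python
def page_view_baseline(report_hours, past_week_total_views):
    """Baseline page views for a report window: the average number of
    views per window of that length over the past week (168 hours).
    Periods longer than 1 week have no baseline."""
    if report_hours > 24 * 7:
        return None  # no baseline provided for periods > 1 week
    windows_in_week = (24 * 7) / report_hours
    return past_week_total_views / windows_in_week
```

For example, with 1,680 page views over the past week, a 1-hour report is compared against a baseline of 10 views.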
A session is a continuous set of user interactions within your application, by a unique user. Uptime.com ends a user session after 30 minutes of inactivity or 24 hours after the session started, whichever comes first. If a user returns after 30 minutes of inactivity, a new session will be recorded.
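The session-boundary rules above can be sketched as a single predicate (a simplified model; names are ours, not an Uptime.com API):

```python
from datetime import datetime, timedelta

def is_new_session(now, last_event, session_start,
                   idle=timedelta(minutes=30), max_len=timedelta(hours=24)):
    """A new session starts after 30 minutes of inactivity, or once the
    current session is 24 hours old -- whichever comes first."""
    return (now - last_event) >= idle or (now - session_start) >= max_len
```

For example, a user returning 31 minutes after their last event starts a new session, while activity 10 minutes later continues the current one.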