A detailed Frontend comparison of e-commerce applications and ways to improve further.

May 2, 2024

In the realm of e-commerce, the frontend plays a pivotal role in shaping user experience and driving conversions. A well-designed and optimized frontend can significantly enhance user engagement, satisfaction, and ultimately, sales. This article delves into a comprehensive comparison of the frontend performance of ten prominent e-commerce applications: Apex, Transcom, Bata, Puma, Calvin Klein, Adidas, Chaldal, Aarong, Daraz, and Asics.

The evaluation encompasses four key criteria which are:

  • Lighthouse Performance metrics for desktop and mobile devices
  • Chrome UX Report Performance metrics for desktop and mobile devices, reflecting real-world user experiences
  • CPU Time for desktop and mobile devices
  • Potential savings in CPU Time for desktop and mobile devices

Furthermore, we will look at the additional methodologies used to carry out the comparison. Finally, we will try to work out what worked for the leading applications and how the others could improve. So, without further ado, join us on this journey as we unravel the intricacies of frontend development in the e-commerce landscape and uncover strategies to elevate the user experience further. From loading times to user interaction, we leave no stone unturned in our pursuit of understanding and improving the frontend dynamics that shape the success of e-commerce applications.

Assessment criteria

As previously mentioned, the assessment considers four criteria. Before we delve into the results and detailed analysis, it is imperative that we are on the same page regarding the assessment criteria. Consider it a 101 course to get to know your tools ahead of time.

  1. Lighthouse Performance Metrics:
    • Purpose: Evaluates web page quality.
    • Devices: Desktop and mobile.
    • Metrics: Page load times, rendering speed, responsiveness.
  2. Chrome UX Report Performance Metrics:
    • Purpose: Reflects real user experiences.
    • Devices: Desktop and mobile.
    • Metrics: First contentful paint, time to interactive, user-centric indicators.
  3. CPU Time:
    • Purpose: Measures CPU processing time.
    • Devices: Desktop and mobile.
    • Significance: Lower CPU time indicates efficiency and smoother user experience.
  4. Potential Savings in CPU Time:
    • Purpose: Identifies optimization opportunities.
    • Devices: Desktop and mobile.
    • Approach: Examines areas for code and resource improvements to reduce device load.
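To make the CPU Time criterion more concrete: one metric Lighthouse derives from main-thread work is Total Blocking Time, which counts only the portion of each long task beyond a 50 ms threshold (tasks under 50 ms are considered short enough not to block input). A minimal sketch, using hypothetical task durations in place of real browser measurements:

```typescript
// Hypothetical long-task durations (ms) captured during page load.
// In a browser these would come from a PerformanceObserver watching
// "longtask" entries; here they are hard-coded for illustration.
const longTaskDurations = [120, 40, 300, 75];

// Total Blocking Time sums only the blocking portion of each task,
// i.e. whatever exceeds the 50 ms threshold.
function totalBlockingTime(durations: number[]): number {
  return durations
    .filter((d) => d > 50)
    .reduce((sum, d) => sum + (d - 50), 0);
}

console.log(totalBlockingTime(longTaskDurations)); // 70 + 250 + 25 = 345
```

Lowering this number (by splitting long tasks or deferring non-critical scripts) is exactly the kind of "potential savings" the fourth criterion looks for.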

Additional methodologies

Now that we know the basis on which we will evaluate the frontends, let us explore the additional methodologies involved. There are five steps, to be precise, which are as follows:

  1. Tool Selection: The primary tool used for these tests is PageSpeed Compare, an online application that provides performance comparisons. This app utilizes several other tools to gather data, including Chrome’s Lighthouse tool and Chrome’s CrUX Dashboard v2. These tools are known for their comprehensive performance auditing capabilities.
  2. Page Selection: The pages selected for testing include the application’s homepage, a randomly selected category or list page, and a randomly selected product details page. These pages are often the most visited and therefore provide a representative sample of the application’s overall performance.
  3. Testing: Each selected page is tested using the aforementioned tools. The tests measure various performance metrics, providing a detailed overview of the page’s speed and efficiency.
  4. Comparison: The results from each page are then compared to provide a comprehensive view of the application’s performance. This comparison can help identify areas of strength and weakness in the application’s design and implementation.
  5. Substitution: If any of the selected pages (homepage, category/list page, product details page) are missing from an application, they will be replaced with a different application for that test. This ensures that the comparison is fair and that all applications are tested on the same parameters.

This methodology provides a robust and comprehensive approach to auditing the performance of web applications. It ensures that all key aspects of an application’s performance are evaluated and compared, providing valuable insights for developers and stakeholders.
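Steps 3 and 4 above can be sketched as a simple score aggregation: test each page type, then average per-app scores to rank the applications. The app names and numbers below are illustrative placeholders, not the measured results:

```typescript
// Hypothetical per-page Lighthouse performance scores for two apps.
type PageScores = { homepage: number; category: number; product: number };

const apps: Record<string, PageScores> = {
  appA: { homepage: 77, category: 73, product: 82 },
  appB: { homepage: 41, category: 30, product: 56 },
};

// Step 3: each page is scored individually; step 4 compares the apps
// by averaging their per-page scores into a single ranking.
function averageScore(s: PageScores): number {
  return (s.homepage + s.category + s.product) / 3;
}

const ranking = Object.entries(apps)
  .map(([name, scores]) => ({ name, avg: averageScore(scores) }))
  .sort((a, b) => b.avg - a.avg);

console.log(ranking[0].name); // "appA"
```

In practice PageSpeed Compare performs this aggregation for you; the sketch just shows the shape of the comparison.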

Drumrolls – The Results

Enough with the nerdy details! Let us now jump straight to the results and the improvement strategies. We will group the findings into three categories based on the webpages involved, which are:

  • Homepage
  • Product details
  • Product category/List page

Result summary: Homepage

1. Lighthouse Performance Metrics:

    • Desktop Winner: Apex (77)
    • Mobile Winner: Apex (41)
    • Insights: Apex excels in both desktop and mobile Lighthouse Performance metrics, indicating superior page load times, rendering speed, and overall responsiveness.
[Chart: Homepage – Lighthouse Performance metrics – Desktop]

[Chart: Homepage – Lighthouse Performance metrics – Mobile]

2. Chrome UX Report Performance Metrics:

    • Desktop Winner: Daraz (85)
    • Mobile Winner: Daraz (90)
    • Runners-up: Apex (75) and Aarong (49)
    • Insights: Despite minor limitations, Apex closely trails Daraz in real-world user experiences on desktop metrics. Daraz, however, outshines its competitors on mobile by optimizing homepage architecture and reducing the number of DOM elements for faster loading. Aarong also performs well but falls slightly behind Apex.
[Chart: Homepage – Chrome UX Report Performance metrics – Desktop]
[Chart: Homepage – Chrome UX Report Performance metrics – Mobile]

3. CPU Time:

    • Desktop Winners: Transcom / Pickaboo (3.2s, tie)
    • Mobile Winner: Pickaboo (10s)
    • Runners-up: Apex (3.5s, desktop), Transcom (11.1s, mobile)
    • Insights: Transcom and Pickaboo share the edge in desktop CPU time, and Pickaboo dominates on mobile, but Apex holds its own in both categories, lagging only a few hundred milliseconds behind on desktop. This indicates that Apex has room for improvement but remains a strong contender in optimizing CPU usage.
[Chart: Homepage – CPU Time – Desktop]
[Chart: Homepage – CPU Time – Mobile]

Improvement Strategies:

  1. Leverage Daraz’s Optimized Architecture:
    • Desktop: Companies like Apex can learn from Daraz’s architecture, implementing strategies such as paging, smaller image sizes, and overall optimization to improve Lighthouse and Chrome UX Report metrics.
    • Mobile: Given Daraz’s significant lead on mobile, reevaluating and streamlining the architecture, introducing infinite loading, and reducing unnecessary elements could enhance the user experience.
  2. Optimize Image Load Times:
    • Desktop: Apex should focus on optimizing large images and swipers to improve CPU time and overall performance.
    • Mobile: Addressing large, unoptimized images is crucial for Apex to compete with Daraz on mobile. Image compression and efficient rendering can significantly enhance mobile performance.
  3. Implement Infinite Loading for Mobile:
    • Mobile: Despite having slightly lower mobile scores compared to Daraz, Apex can still improve its mobile performance and user experience by introducing infinite loading for products. This approach reduces the number of DOM elements, which can lead to faster loading times and a smoother browsing experience. By prioritizing such initiatives, Apex can overcome its limitations and provide a better experience to its mobile users.
  4. UI Adjustments for Mobile Efficiency:
    • Mobile: If feasible, it would be better to reevaluate and simplify the UI on mobile, akin to Daraz’s approach. Reducing the number of sections and elements can lead to faster load times and improved Chrome UX Report metrics.
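The infinite-loading strategy above can be sketched as a paging function that keeps the DOM small by rendering one slice of products at a time. The product list and page size here are hypothetical; in a real app an IntersectionObserver on a sentinel element at the bottom of the list would trigger each new page:

```typescript
// Infinite loading: render one page of products at a time instead of
// the whole catalog, keeping the DOM node count low.
interface Product { id: number; name: string }

function getPage<T>(items: T[], page: number, pageSize: number): T[] {
  return items.slice(page * pageSize, (page + 1) * pageSize);
}

// Hypothetical catalog of 95 products.
const catalog: Product[] = Array.from({ length: 95 }, (_, i) => ({
  id: i,
  name: `Product ${i}`,
}));

// The first render creates only 20 DOM nodes instead of 95.
console.log(getPage(catalog, 0, 20).length); // 20

// In the browser, an observer on a sentinel <div> would append the next page:
// new IntersectionObserver(() => render(getPage(catalog, ++page, 20)))
//   .observe(sentinelElement);
```

Fewer DOM elements per render is precisely the mechanism credited to Daraz's faster mobile loading above.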

Result summary: Product details page

  1. Lighthouse Performance Metrics:
    • Desktop Winner: Apex (82)
    • Mobile Winner: Daraz (56)
    • Mobile Bottom Performer: Apex (31)
    • Insights: Despite facing some challenges on the mobile platform, Apex performs exceptionally well in desktop performance metrics, particularly in Lighthouse scores, and ranks among the top performers. Although it falls behind Daraz on mobile, Apex remains a noteworthy contender in the overall rankings.
  2. CPU Time:
    • Desktop Winner: Transcom (1.6s)
    • Runners-up: Adidas (1.9s), Apex (2.3s)
    • Insights: While Transcom and Adidas are leading the pack in CPU efficiency on desktop, it’s worth noting that Apex has taken the third spot. This indicates a strong potential for improvement in CPU time optimization for Apex.
  3. CPU Time – Mobile:
    • Winner: Daraz (4s)
    • Bottom Performer: Apex (14.8s)
    • Insights: Daraz excels in mobile CPU time efficiency, while Apex lags behind significantly. There’s a substantial room for improvement in optimizing CPU usage for Apex on mobile.
  4. Chrome UX Report Performance Metrics:
    • Desktop and Mobile: Test failed, not enough data.
    • Insights: Unfortunately, there’s insufficient data for Chrome UX Report metrics on both desktop and mobile. This lack of information limits a comprehensive evaluation of the real-world user experience on these platforms.
[Chart: Product Details – Lighthouse Performance metrics – Desktop]
[Chart: Product Details – Lighthouse Performance metrics – Mobile]

Improvement Strategies:

  1. Mobile Performance Enhancement:
    • Lighthouse Metrics: Overall, Apex and Daraz are at the forefront. Apex can pull ahead by addressing its mobile performance issues and prioritizing optimizations that improve mobile page load times and overall responsiveness.
  2. CPU Time Optimization:
    • Desktop and Mobile: Implement strategies to enhance CPU time efficiency, especially on mobile. Learning from Transcom’s and Adidas’s desktop performance could guide Apex in refining its CPU usage.
  3. Investigate Chrome UX Report Failures:
    • Desktop and Mobile: Resolve the issues causing the failure in Chrome UX Report tests. Gaining insights from real-world user experiences is crucial for identifying areas of improvement in user satisfaction.
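A practical first step in investigating such failures is to query the CrUX dataset at the origin level, since page-level records often fail simply because the individual URL lacks enough real-user traffic. The sketch below only builds the request; the endpoint shape follows Google's public Chrome UX Report API, and the API key is a placeholder:

```typescript
// Build a CrUX API query. Querying by origin (rather than a specific url)
// reveals whether any real-user data exists for the site at all.
interface CruxQuery { origin: string; formFactor: "PHONE" | "DESKTOP" }

function buildCruxRequest(query: CruxQuery): { url: string; body: string } {
  const API_KEY = "YOUR_API_KEY"; // placeholder - supply your own key
  return {
    url: `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    body: JSON.stringify({ origin: query.origin, formFactor: query.formFactor }),
  };
}

const req = buildCruxRequest({ origin: "https://www.example.com", formFactor: "PHONE" });
// fetch(req.url, { method: "POST", body: req.body }) would return origin-level
// metrics, or a 404 when even the origin has insufficient data.
```

If the origin-level query also returns no data, the "test failed" result above reflects low traffic rather than a tooling problem.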

Result summary: Product category page

  1. Lighthouse Performance Metrics:
    • Desktop Winner: Apex (73)
    • Mobile Winner: Pickaboo (56)
    • Mobile Runners-up: Apex, Adidas, Daraz (30 each)
    • Insights: Apex leads in desktop Lighthouse scores, while Pickaboo takes the lead on mobile. However, Apex faces some competition on mobile, ranking alongside Adidas and Daraz.
  2. Chrome UX Report Performance Metrics:
    • Desktop Winners: Daraz (85), Apex (83)
    • Mobile Winner: Aarong (69)
    • Bottom Performer: Apex (51)
    • Insights: Daraz and Apex excel in desktop Chrome UX Report metrics, but Aarong takes the lead on mobile. Apex falls behind on mobile, indicating potential areas for improvement in user-centric experiences.
  3. CPU Time:
    • Desktop Winner: Pickaboo (1.6s)
    • Third: Apex (3.2s)
    • Insights: Pickaboo demonstrates exceptional CPU time efficiency on desktop, while Apex holds a respectable position. However, there is room for improvement to close the gap between the two.
  4. CPU Time – Mobile:
    • Winner: Daraz (6.7s)
    • Second: Transcom (11.1s)
    • Fifth: Apex (13s)
    • Insights: Daraz leads in mobile CPU efficiency, with Apex trailing in the fifth position. Addressing mobile CPU time is crucial for Apex to enhance overall performance.
[Chart: Product Category – Lighthouse Performance metrics – Desktop]
[Chart: Product Category – Lighthouse Performance metrics – Mobile]

Improvement Strategies:

  1. Mobile Lighthouse Optimization:
    • Lighthouse Metrics: Given the competition on mobile, Apex should focus on optimizing mobile page load times and responsiveness. Learning from Pickaboo’s success could provide valuable insights.
  2. Enhance Mobile Chrome UX:
    • Chrome UX Report Metrics: To compete with Aarong on mobile, Apex should analyze and improve user-centric metrics. Identify and address elements impacting user experiences, aiming for a higher Chrome UX Report score.
  3. Close the Gap in Mobile CPU Time:
    • CPU Time – Mobile: Analyze and optimize CPU usage on mobile devices. Learning from Daraz and Pickaboo can guide Apex in improving efficiency and reducing mobile CPU time.
  4. Consolidate Desktop Strengths:
    • Desktop Metrics: While Apex leads in desktop Lighthouse scores, maintaining and improving this position should be a focus. Address any areas where competitors are closing the gap to ensure a consistently strong desktop performance.
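One concrete image optimization along these lines is serving width-appropriate variants via srcset, so mobile devices download and decode smaller files, cutting both load time and CPU time. The CDN URL and its ?w= width parameter below are assumptions for illustration, not a specific vendor's API:

```typescript
// Build a responsive srcset string for an image CDN that is assumed
// to accept a ?w= width parameter for resizing.
function buildSrcset(baseUrl: string, widths: number[]): string {
  return widths.map((w) => `${baseUrl}?w=${w} ${w}w`).join(", ");
}

const srcset = buildSrcset("https://cdn.example.com/shoe.jpg", [320, 640, 1280]);
console.log(srcset);

// The string would then be used in markup such as:
// <img src="https://cdn.example.com/shoe.jpg?w=640"
//      srcset={srcset}
//      sizes="(max-width: 600px) 100vw, 50vw"
//      loading="lazy" />
```

Combined with loading="lazy" on below-the-fold product images, this lets a category page ship only the pixels each device actually needs.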

Final thoughts

In the world of e-commerce, Apex and Daraz stand out, despite facing some mobile challenges. The in-depth frontend assessments reveal that Apex consistently excels in desktop performance, earning impressive scores in Lighthouse and Chrome UX Report tests. Although its mobile responsiveness and CPU efficiency are not as strong as those of Daraz and Pickaboo, Apex has proven to be a desktop performance champion. As competition in the e-commerce realm intensifies, Apex can draw insights from top performers like Daraz and Pickaboo to bridge the gaps in mobile responsiveness and CPU efficiency. By striking a balance between leveraging its desktop strengths and implementing targeted mobile enhancements, Apex can deliver seamless and efficient user experiences, ultimately defining its success in the highly competitive digital marketplace.