TECHNICAL AUDIT EXAMPLE
We conduct technical audits to provide a comprehensive evaluation of various platforms’ performance, security, and efficiency. This example showcases how our analysis can uncover areas for improvement and offer actionable solutions to enhance your system.
Build performance
Separating the UI and backend has significantly enhanced performance. The average UI build time is 2 minutes and 55 seconds, which is impressive and enables quick rollouts of new features and critical fixes.
Runtime performance
At first glance, the metrics show a good overall result, but there are also areas with room for improvement:
- FCP - First Contentful Paint measures how long the browser takes to render the first piece of DOM content after a user navigates to the page. The yellow rating means it needs improvement. In most cases this metric is influenced by project size, so appropriate refactoring and removal of legacy and unused code should improve it.
- LCP - Largest Contentful Paint measures when the largest content element in the viewport is rendered to the screen. Diagnostics show that the main factor affecting this value is load delay (the gap between TTFB and the start of loading the largest element). To optimize it, the browser must be able to discover this image earlier.
- CLS - Cumulative Layout Shift measures the largest burst of layout-shift scores for unexpected layout shifts that occur during a page’s entire lifespan. The current value is concerningly high, yet no shifts are visually detectable. Diagnostics indicate the issue is related to the sliders, specifically shifts caused by the third-party “keen-slider” library, which is likely the root cause.
- FID - First Input Delay measures the time between a user’s first interaction with a page and the moment the browser can process that input. The value is satisfactory.
- TTFB - Time To First Byte measures the time between the request for a resource and the arrival of the first byte of the response. This value is also good.
- SI - Speed Index measures how quickly content is visually displayed during page load. The value is 1.8s, which is considered fast and acceptable.
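For the LCP load delay noted above, one common fix is to let the browser discover the largest image earlier with a preload hint. A minimal sketch, assuming the LCP element is a hero image (the path is hypothetical):

```html
<!-- Hypothetical hero image: preloading lets the browser start the
     download before it parses the markup that references it. -->
<link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">
```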
One more metric should be mentioned: TBT - Total Blocking Time. The desktop value is not yet critical, but we strongly recommend reviewing it, as the DOM is oversized, which affects style computation and template re-rendering. This is caused by non-optimized .svg files, which should be simplified, and by the large number of child elements in the slider.
On mobile, as shown above, SI is much slower and Total Blocking Time is critical. The primary issue is script evaluation, which takes 2.6 of the 4.1 seconds of main-thread work (63%). The code likely needs to be optimized to avoid unnecessary calculations.
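Since the application uses Next.js (see the code review below), one way to reduce script evaluation on the main thread is to code-split heavy, below-the-fold components with `next/dynamic`. A sketch, not taken from the project; the component name and path are hypothetical:

```tsx
import dynamic from 'next/dynamic';

// Hypothetical heavy component: it is loaded as a separate chunk only
// when rendered, keeping it out of the initial JS bundle.
const HeavySlider = dynamic(() => import('../components/HeavySlider'), {
  ssr: false, // skip server rendering for purely client-side widgets
  loading: () => <p>Loading…</p>,
});

export default function Page() {
  return <HeavySlider />;
}
```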
Thorough code review
Code structure
The UI application uses the Next.js framework to implement SSR for better SEO and optimization. However, this approach may result in some poor performance metrics, as described in the performance section.
The code is structured according to Next.js standards. However, some files are messy and difficult to maintain.
The import order should follow this structure: reusable (src functions) -> framework built-in/third-party modules -> neighboring modules -> styles.
Sometimes, imports take up a large portion of the file, as shown below. A potential solution is to create a file at the path “@sm/[folder-name]/[storage-file]” that stores all modules related to the parent folder’s responsibility.
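To illustrate both the import order and the barrel-file idea, a storage file could re-export everything its folder is responsible for. The folder name “profile” and the module names below are hypothetical; only the “@sm” alias and the ordering rule come from the conventions above:

```ts
// @sm/profile/storage.ts — hypothetical barrel file that re-exports
// everything the "profile" folder is responsible for.
export { ProfileCard } from './ProfileCard';
export { useProfile } from './useProfile';
export type { Profile } from './types';

// A consumer file then follows the recommended order:
// reusable src functions → framework/third-party → neighboring modules → styles.
import { formatDate } from '@sm/utils';                        // reusable src functions
import { useRouter } from 'next/router';                       // framework built-in
import { ProfileCard, useProfile } from '@sm/profile/storage'; // neighboring modules
import styles from './Profile.module.scss';                    // styles
```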
However, the folder structure is not always clear. For example, src/types is intended to store TypeScript data types, but separate files responsible for type definitions and types defined in modules are scattered throughout the code. A better approach might be to store all types in one place.
Instead of creating separate JSON files with static data, all static objects are declared in a single file. While this makes the folder structure clearer, it also impacts file readability.
Additionally, there are many hardcoded reusable values. It’s better to store such values as constants in separate files, making them easier to change or debug in the future. For instance, the word ‘developer’ appears 16 times as a role, and ‘Developer’ appears 3 times as a label. If the client requests to change ‘developer’ to ‘creator,’ it could lead to inconsistent results if not updated everywhere.
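A minimal sketch of the constants approach (the module shape and names are our suggestion, not code from the project): with a single source of truth, renaming ‘developer’ to ‘creator’ is a one-line change.

```typescript
// Hypothetical constants module: one place to change reusable values.
const ROLES = {
  DEVELOPER: 'developer',
} as const;

const LABELS = {
  DEVELOPER: 'Developer',
} as const;

// Every reference goes through the constants instead of a string literal.
const roleSentence = (name: string): string =>
  `${name} works as a ${ROLES.DEVELOPER}`;
```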
Although the application uses the .webp format for non-.svg images, the project may also contain .png versions, which seem to be legacy files (such as those in the header folder).
Styles
Styles are written with the SCSS preprocessor, and its features are used extensively throughout the application. Generally, SCSS files average 150-200 lines, with some exceeding 300 lines. The number of these files is quite large and could potentially be optimized and shortened.
For instance, the “font-size” property appears 345 times, often alongside other font properties like “font-weight” and “line-height.” Creating a mixin that requires “font-size” as an argument may be beneficial, with other font-related parameters defaulting to “inherit.” Additionally, out of 345, the “font-size” with a “14px” value appears 92 times (1st place) and only 4 times within the media queries. Setting this as the default size could save 88+ lines of code.
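The proposed mixin could be sketched as follows (selector names are illustrative); font-size defaults to the most frequent value, 14px, and the other font properties default to inherit:

```scss
// Proposed mixin: font-size defaults to the most common value (14px),
// other font-related properties default to inherit.
@mixin font($size: 14px, $weight: inherit, $line-height: inherit) {
  font-size: $size;
  font-weight: $weight;
  line-height: $line-height;
}

.card-title {
  @include font(18px, 600, 1.4);
}

.card-caption {
  @include font; // uses the 14px default
}
```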
In some places, properties are redundantly overridden (see the example below).
Sometimes, properties are identical for two selectors that differ in only one respect; the selectors could easily be combined with a comma (see the example below).
There are also some minor lexical errors. For example, the attribute “data-name=’dawn’” should be “data-name=’down’” according to the context. This is not critical unless screen readers announce it, since the pronunciation differs.
API
The API is based on the Express.js framework using a modular approach. The possibility of database migration is provided and described in the README. The code is clear, concise, well-organized, and follows OOP principles.
Infrastructure Analysis
The infrastructure is robust and satisfactory. The deployment process is simple and requires no additional interventions other than setting up environment variables. The UI, API, and database all use the same cloud provider, positively impacting project speed. Services to prevent insecure interactions are also provided.
1. Broken page links
The website has links to 404 pages.
For example:
PDF for download with a 404 error:
- https://www.site.com/wp-content/uploads/2020/01/book-1.pdf
Action required: Broken links should be updated to point to accessible pages.
Here is a list of pages containing broken links:
Broken page links.
2. 3xx redirected pages
There is an incorrect link to the Privacy Policy in the footer.
The site uses: https://www.site.com/reference/privacy-policy
In this block: https://www.site.com/reference/privacy-policy-2021 — redirects to the correct version using a 301 Redirect.
Action required: The link should be replaced with the correct one.
Additionally, other internal links result in redirects; these also need to be updated to the correct URLs.
Here is a list of such links, the pages they are found on, and what they should be replaced with:
3xx page links.
3. Duplicate pages
The site contains duplicate pages of two types:
- Pages with GET parameters (campaigns).
- Pages in different directories.
Action required: For pages with GET parameters, the Canonical tag should be configured on static pages to point to themselves.
The main page should be selected for pages in different directories, and a 301 redirect should be set up to it. Then, links on the website should be updated to the correct ones.
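A sketch of the self-referencing canonical described above (the URL is illustrative): any campaign variant of the page with GET parameters then consolidates to the static URL.

```html
<!-- Placed in the <head> of the static page, including when it is
     reached with campaign GET parameters such as ?utm_source=... -->
<link rel="canonical" href="https://www.site.com/offering/academy">
```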
4. Multiple H1 tags
Some pages on the site contain duplicate <h1> headers. Most of these duplications are found in the “Learn more with our resources” block.
Action requiredA decision is needed regarding whether these headers should remain as <h1> tags. While the error is not critical, addressing it would enhance the page’s logical structure.
5. Hidden blocks and texts
On the page https://www.site.com/offering/academy, there is a hidden block. We can see it by removing the CSS property “display: none” from a div with the class “page-wrapper hide.” The block reveals a header, main content, and footer when this property is removed. Essentially, it’s another page within the page, hidden from users but visible to Google’s robots.
Search engine algorithms might see such hidden content as manipulative behavior by the site owners, potentially leading to penalties or sanctions.
A screenshot of the page with the hidden block can be viewed here.
A screen recording demonstrating the appearance of the hidden block when the display: none property is deactivated can be viewed here.
On the page https://www.site.com/services, there is a section with the class “section_blog47 hide” that is not accessible to users due to the CSS property “display: none;”
On the page https://www.site.com/offering/keynote, there’s a section with the class “section-faq2 hide” that’s also not accessible to users due to the CSS property “display: none;”. This section contains a significant amount of content.
On the page https://www.site.com/about-us, sections with the classes “section-team3 hide” and “section-cta-block hide” are hidden. These sections are not accessible to website users but are visible to the Googlebot.
Additionally, there are some blocks of placeholder “Lorem Ipsum” text on some pages of the website. This can also have a negative impact on the overall quality of the site and its ranking in the top search results.
Placeholder text can be found on the pages https://www.site.com/services and https://www.site.com/resources-search.
Action required: Hidden placeholder text must be removed from all pages. Likewise, any sections or blocks that are hidden from users via the CSS property “display: none;” but remain accessible to Google’s robots must be removed from the page code. Such elements typically have the “hide” class.
6. The link to the 404 page
On the page https://www.site.com/offering/academy, there is a link that leads to a page displaying a 404 error. Although it’s not accessible to website users, search engine bots can see this link in the page’s code and subsequently access it.
Action required: The link should be replaced with the up-to-date one:
https://www.site.com/resources.
7. Redirect chains
The website contains multiple redirect chains. These occur when several URLs redirect the user in sequence. For example, a chain of two redirects looks like this: URL 1 → URL 2 → URL 3.
It’s better to avoid redirect chains, as they slow down website performance and complicate the scanning process for robots.
Action required: Redirect chains can be eliminated by setting up direct 301 redirects to the final URLs.
From http://site.com/results, a 301 redirect should be set up directly to https://www.site.com/results.
Similarly, from https://site.com/resources/, a 301 redirect should be set up directly to https://www.site.com/resources.
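Assuming an Nginx setup (the audit does not state the server software), the host-level part of these direct redirects could be sketched like this; certificate directives are omitted for brevity:

```nginx
# Send all HTTP and bare-domain traffic straight to the canonical host
# in a single 301, avoiding intermediate hops in the chain.
server {
    listen 80;
    server_name site.com www.site.com;
    return 301 https://www.site.com$request_uri;
}

server {
    listen 443 ssl;
    server_name site.com;
    # ssl_certificate / ssl_certificate_key omitted in this sketch
    return 301 https://www.site.com$request_uri;
}
```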
8. Links to pages with the HTTP protocol
The website contains multiple hyperlinks whose addresses are specified using the HTTP protocol. While these links do redirect users to pages served over HTTPS, the extra redirect can impact page loading speed, especially when accessing the site over mobile internet.
Action required: These links need to be replaced with those using the HTTPS protocol.
On the page https://www.site.com/services, a link should be changed from http://site.com/results to https://www.site.com/results. The link anchor is “View all.”
On the page https://www.site.com/post/the-creative-act-of-management, a link should be changed from http://www.site.com/ to https://www.site.com/results.
9. Links to pages with a 301 redirect
The website contains internal links to pages with a 301 redirect. Such links can complicate the process of site scanning for search engine bots and slow down the loading speed of the final page. For users, this situation might create a negative experience, leading them to leave the site before the desired page fully loads.
Action required: Links in the code need to be replaced with the final URLs: link to the document.
10. Pagination pages
10.1. Set up the URL
Pagination pages should feature friendly URLs, i.e. URLs without unnecessary numbers, symbols, or letters. Friendly URLs primarily serve to enhance user interaction with the website, enabling users to quickly understand which page of the site they are on.
For example, the URL of the second pagination page should include the GET parameter “?page=2”.
Action required: To resolve this issue, user-friendly parameters for pagination page URLs should be established, with a 301 redirect from the existing pages to the new URLs. The current list of pagination pages on the website is available through this link. Additionally, the links within the “Next” and “Previous” buttons should be updated to the new and pertinent URLs.
10.2. Set up a 301 redirect from the first pagination page to the main category pages.
When transitioning from pagination pages to the first page, the URL should not contain any GET parameters.
For example, when transitioning from the second pagination page https://www.site.com/topic/team-happiness?ca589ac6_page=2 to the first pagination page, the URL of the first one should look like this: https://www.site.com/topic/team-happiness.
Currently, the URL of the first pagination page appears as follows: https://www.site.com/topic/team-happiness?ca589ac6_page=1, and this page is an exact duplicate of https://www.site.com/topic/team-happiness. This duplication negatively impacts the ranking of category pages.
Action required: A 301 redirect should be set up from all first pagination pages with GET parameters to the main category pages. Refer to our redirects table: link to the document.
Internal links should be updated to the new URLs without the GET parameter.
10.3. Optimize meta tags on pagination pages for uniqueness
The second, third, and subsequent pagination pages should contain unique Title and Description meta tags to prevent Google from treating these pages as duplicates. To make the meta tags unique, add the page number to the existing ones.
Action required: The pagination page meta tags need to include the page numbers.
For example, the Title of the page https://www.site.com/topic/team-happiness?ca589ac6_page=2 should look like this:
Title: Team Happiness | Page 2
Description: All successful teams in high-growth organizations have certain things in common. Find out what they are and how to implement them on your team, starting today. | Page 2
11. The contact page is missing an H1 heading
Add an H1 tag to the page https://www.site.com/contact-us. You can use the phrase “CONTACT US,” which is already present on the page, as the H1.
This is not a critical error, but if possible, fix it.
12. Description of team members’ pages
On some team members’ pages, the Description tag is filled in but overly lengthy, duplicating the page’s content entirely. On others, the Description tag is empty. While team members’ pages might not require tag optimization for specific queries, it’s still recommended to fill them in to strengthen E-A-T ranking factors.
Action required: For team pages, we suggest creating Descriptions using the template and adding unique meta tags to website pages. For author pages where the Description needs to be changed or added, please refer to the document link. The Description template:
[Team Member’s Name]: [Position]. Brief description of the individual’s specialization and expertise.
Example Description: N***** M******-Sy: Coach. N****** is very happy to provide services to people who are defining their career and life plans, determining their future path, and moving forward.
13. Empty Description
On some pages, the Description tag is empty. Yet this is a crucial element of search engine optimization, as its content is displayed in search snippets. The more appealing the snippet, the higher the likelihood of users clicking through to the website.
If a page lacks a Description, search engines may automatically fill in this tag, and its content could potentially be irrelevant or uninteresting to users.
Action required: We have provided unique Descriptions for the pages that required them. They need to be added to the website; please refer to the document link.
14. Images without the “alt” attribute
The absence of “alt” attributes in image tags represents a missed opportunity for textual optimization of both images and the pages they are placed on. For pages where images constitute the primary content, using the “alt” attribute is critically important.
Action required: Each significant image on the website needs an “alt” attribute providing a brief, concise description of its content. Alternatively, the “alt” attribute can be filled using the template [Page H1 + Photo 1]. The list of pages with images lacking the “alt” attribute can be found in the document link.
15. Robots.txt file
Currently, the file is left empty, allowing Google bots to freely scan all pages of the website, including those that should be restricted from indexing.
Pages like search pages, registration pages, and others fall into this category. Therefore, it’s necessary to block search engine bots’ access to these pages.
Action required: We have created a new robots.txt file, and it needs to be updated on the server to implement the necessary access restrictions.
User-agent: *
Disallow: /resources-search
Disallow: /login
Disallow: /log-in
Disallow: /sign-up
Disallow: /update-password
Disallow: *utm=
Disallow: /*?ref=
Disallow: /search
Disallow: /access-denied
Sitemap: https://www.site.com/sitemap.xml
16. Link to the search page
To utilize the site search, you need to first navigate to the search page at https://www.site.com/resources-search. As a result, a non-priority page in terms of search engine optimization has a high number of internal links pointing to it.
Action required: The search functionality needs to be modified. When users click on the search icon, they shouldn’t be redirected to a new page. Instead, they should be able to enter their query in a search bar on the same page, and only then be redirected to the search results page.
17. Page loading speed
A page loading speed analysis was conducted using the PageSpeed Insights tool, focusing on the page https://www.site.com/offering/academy. As the results show, the loading speed of the mobile version can be improved: some metrics show average or critical values. While this isn’t a critical issue, we recommend addressing certain aspects where possible.
Some pages have an excessively long server response time (over 0.7 s): link to the document. Additionally, certain images have large file sizes, contributing to poor loading metrics.
For instance, we see that scripts and videos have the longest loading times on the page. If possible, you should optimize them or set up lazy loading.
Heavy images and photos slow down the website’s performance. They should be optimized by reducing their size using specialized compressors before uploading them to the site. These tools compress images without compromising their quality. Additionally, utilizing more lightweight and modern formats (such as WEBP) and implementing lazy loading can further enhance the loading speed.
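A sketch combining both recommendations, modern formats and native lazy loading (paths and dimensions are illustrative); the `<picture>` element falls back to the original format in browsers without WEBP support:

```html
<!-- Serve WEBP where supported, fall back to the original format,
     and defer offscreen images with native lazy loading. -->
<picture>
  <source srcset="/images/team.webp" type="image/webp">
  <img src="/images/team.jpg" alt="Our team"
       loading="lazy" width="800" height="533">
</picture>
```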
We recommend combining the main CSS files into a single file to reduce the number of server requests and removing any unused CSS styles. Afterward, you can further optimize the file using a minifier to reduce its size.
Additionally, it’s important to set up effective caching rules, especially for images, fonts, and heavy scripts. This way, when users revisit the website, page loading will be significantly faster.
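Again assuming Nginx, and assuming static assets use fingerprinted filenames (so a changed file gets a new URL), the caching rules could be sketched as:

```nginx
# Long-lived caching for static assets; "immutable" is safe only when
# filenames are fingerprinted so a changed file gets a new URL.
location ~* \.(webp|jpg|png|svg|woff2|js|css)$ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
```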
Improvement ideas and proposed solutions
First, it would be beneficial to have well-written documentation describing the project’s foundations: folder/file structure, commands, requirements, general code-writing rules, and other things that help newcomers with onboarding and maintain good code quality. Additionally, it is advisable to encourage developers to include explanatory comments in their code.
Code refactoring and removal of legacy code should also be carried out before moving ahead, as should upgrading libraries.
More accurate recommendations were described in previous sections.
Conclusion
Overall, the website does not require any critical changes unless it is going to be extended in the future. The introduction of new features could complicate maintenance and lead to performance-related issues, and the project’s existing weak points, such as redundant and duplicated styles, hardcoded values, and CLS issues, are likely to become more prominent if left unaddressed.