Think of your website like a car. You can tell if the engine sounds rough or if the paint is scratched, but without a proper diagnostic scan, you'd never know about that slowly failing sensor, the brake pads wearing thin, or the transmission fluid that's overdue for a change. A surface-level glance only catches what's obvious — the real problems hide underneath.
That's exactly what Greadme's Deep Scan does for your website. It performs a comprehensive analysis of a single URL, examining over 100 parameters across six critical categories: performance, SEO, accessibility, best practices, schema markup, and meta tags. Instead of guessing where your website falls short, you get a detailed diagnostic report with actionable recommendations that tell you exactly what needs fixing and why it matters.
Whether you're a developer optimizing for Core Web Vitals, a marketer trying to improve search rankings, or a business owner who simply wants their website to work better — Deep Scan gives you the data-driven clarity you need to make informed decisions.
Every Deep Scan produces scores across six distinct categories. Each category represents a fundamental pillar of website quality, and together they give you a holistic view of your page's strengths and weaknesses.
Performance measures how fast your page loads and how responsive it feels to users. This isn't just about speed for its own sake — Google uses Core Web Vitals as a ranking factor, and many users abandon pages that take more than three seconds to load.
Deep Scan evaluates key performance metrics including:
Beyond the headline metrics, the scan also identifies specific bottlenecks like render-blocking resources, unminified JavaScript and CSS, unused code, unoptimized images, and missing text compression.
Studies have found that a one-second delay in page load time can reduce conversions by around 7%. But performance isn't just about revenue — it's about equity. Users on slower connections or older devices are disproportionately affected by poor performance. Optimizing your site's speed means making it accessible to a wider audience.
The SEO category evaluates how well your page is optimized for search engine discovery and ranking. Deep Scan checks for technical SEO fundamentals that determine whether search engines can properly crawl, understand, and index your content.
Key SEO checks include:
Deep Scan includes a specialized detector for client-side rendered (CSR) websites. If your page relies heavily on JavaScript to render its content, search engines may struggle to index it properly. When CSR is detected, the SEO score is reduced by up to 70 points to reflect the real-world impact on your search visibility. This is one of the most impactful findings Deep Scan can surface.
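To illustrate what this detector is looking for, here is a sketch of the initial HTML a crawler can receive from a purely client-side rendered page before any JavaScript runs. The file names and title are hypothetical; the point is that the document body contains no headings, text, or links for a search engine to index:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <!-- All content is injected into this empty container at runtime. -->
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>
```

Server-side rendering or pre-rendering fixes this by shipping the actual content in the initial HTML response.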
Accessibility measures how usable your website is for people with disabilities, including those using screen readers, keyboard navigation, or other assistive technologies. Beyond being the right thing to do, accessibility is increasingly a legal requirement in many jurisdictions.
Deep Scan evaluates accessibility against WCAG (Web Content Accessibility Guidelines) standards, checking for:
The Best Practices category covers security, browser compatibility, and adherence to modern web standards. These are the fundamentals that keep your site safe, reliable, and future-proof.
This includes checks for:
Meta tags control how your page appears in search results, social media shares, and browser tabs. Despite their simplicity, misconfigured or missing meta tags are among the most common issues found during audits.
Deep Scan evaluates the following meta tags:
The scan also checks Open Graph tags (og:title, og:description, og:image, og:type, og:locale) and Twitter Card tags (twitter:card, twitter:title, twitter:description, twitter:image, twitter:site, twitter:creator) to ensure your content displays correctly when shared on social platforms.
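For reference, a typical head section with these tags looks like the sketch below. The tag names match those listed above; all values are hypothetical placeholders:

```html
<meta property="og:title" content="Example Article Title" />
<meta property="og:description" content="A short summary of the page." />
<meta property="og:image" content="https://example.com/preview.jpg" />
<meta property="og:type" content="article" />
<meta property="og:locale" content="en_US" />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Example Article Title" />
<meta name="twitter:description" content="A short summary of the page." />
<meta name="twitter:image" content="https://example.com/preview.jpg" />
<meta name="twitter:site" content="@example" />
<meta name="twitter:creator" content="@author" />
```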
Schema markup (structured data) helps search engines understand the meaning and context of your content, not just its text. Properly implemented schema can unlock rich results in search — like star ratings, FAQ dropdowns, product prices, and event dates appearing directly in search results.
Deep Scan doesn't just detect whether schema exists — it validates the schema against standards and checks eligibility for specific rich result types. This is a critical distinction because having invalid or incomplete schema can be worse than having none at all.
Schema markup is treated as an optional category and is excluded from your main total score. This is intentional — not every page needs structured data. However, for pages where schema is relevant (product pages, articles, local business listings, events), it can be a powerful differentiator in search results.
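As a concrete example, here is a minimal sketch of valid JSON-LD for a product page, including the fields that typically make a page eligible for price and star-rating rich results. All names and values are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://example.com/widget.jpg",
  "description": "A sample product description.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```

Omitting or mistyping a required field (for example, leaving out `offers`) is exactly the kind of eligibility problem that validation catches but simple schema detection would miss.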
Deep Scan produces individual scores for each of the six categories, plus an overall total score. Understanding how these scores work helps you interpret your results and prioritize improvements.
The total score aggregates the individual category scores (excluding the optional Schema Markup category) to give you a single number that represents your page's overall health. While the total score is a useful summary, the real value lies in the individual category breakdowns and the specific audit findings within each one.
A score of 100 in every category is rarely achievable or necessary. Some trade-offs are intentional — for example, a complex web application may sacrifice some performance for functionality. Focus on understanding what each finding means for your specific goals rather than obsessing over hitting a perfect number.
After the scan completes, your results are organized into expandable sections for each category. Each section contains individual audit findings with severity indicators that help you prioritize.
Every finding is classified by severity:
Each audit finding includes:
Deep Scan captures a final screenshot of your page as it appears after loading, along with a thumbnail timeline showing the visual loading progression. This is particularly valuable for understanding perceived performance — how quickly does the page look "ready" to your users?
One of Deep Scan's most powerful features is the optional AI-powered summary. When enabled, Greadme's AI analyzes your complete scan results and generates a prioritized, plain-language summary that highlights:
The AI summary transforms raw audit data into an actionable game plan. Instead of staring at dozens of technical findings and trying to figure out where to start, you get a clear narrative that tells you: "Here's what matters most, and here's what to do about it."
The AI summary feature has usage limits that vary by plan. Free users get a limited number of lifetime AI summaries, while paid plans include monthly quotas. The remaining AI summary count is displayed before you start each scan so you can decide when it's most valuable to use.
Deep Scan lets you choose between desktop and mobile analysis — and the choice matters more than you might think. The same page can produce dramatically different results depending on the device context.
Mobile analysis simulates a mid-tier mobile device on a slower network connection (4G). This means performance metrics like First Contentful Paint (FCP) and Largest Contentful Paint (LCP) will often be significantly slower than desktop results. Since Google uses mobile-first indexing, your mobile score is arguably more important for SEO than your desktop score.
We recommend running Deep Scans on both desktop and mobile for pages you're actively optimizing. Performance bottlenecks that are invisible on desktop (large images, heavy JavaScript bundles) often become critical issues on mobile where bandwidth and processing power are constrained.
Deep Scan results aren't just for your eyes. Depending on your plan, you can:
These sharing capabilities make Deep Scan a collaboration tool, not just a diagnostic one. When a developer needs to understand what SEO issues exist, or when a client asks "how is my website doing?" — you can send them a link to a detailed, professional report.
Greadme offers two distinct scanning approaches, and understanding when to use each one will save you time and give you better insights:
| Feature | Deep Scan | Crawl Scan |
|---|---|---|
| Scope | Single URL, comprehensive | Entire website, multiple pages |
| Depth of Analysis | 100+ parameters per page | Key SEO and content checks per page |
| Performance Metrics | Full Core Web Vitals | Not included |
| Schema Validation | Full validation with eligibility | Not included |
| Best Use Case | Optimizing a specific page | Auditing an entire site for issues |
| AI Summary | Available | Not available |
The ideal workflow is to start with a Crawl Scan to identify which pages have issues across your entire site, then use Deep Scans on the pages that matter most — your homepage, key landing pages, product pages, and any pages the Crawl Scan flagged as problematic.
After analyzing thousands of websites, certain issues appear consistently. Here are the most common problems Deep Scan finds and why they matter:
How common: Found on roughly 70% of scanned pages
The impact: Images are typically the largest assets on a page. Serving them without proper compression, in outdated formats (JPEG/PNG instead of WebP/AVIF), or without responsive sizing can add seconds to your load time.
The fix: Convert images to modern formats, implement responsive srcsets, and use lazy loading for below-the-fold images.
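Those three fixes can be combined in a single `<picture>` element. A sketch, with hypothetical file names, that serves AVIF or WebP where supported, falls back to JPEG, picks a size appropriate to the viewport, and lazy-loads the image (appropriate only for below-the-fold images):

```html
<picture>
  <!-- Modern formats first; the browser uses the first type it supports. -->
  <source type="image/avif" srcset="photo-800.avif 800w, photo-1600.avif 1600w" />
  <source type="image/webp" srcset="photo-800.webp 800w, photo-1600.webp 1600w" />
  <img
    src="photo-800.jpg"
    srcset="photo-800.jpg 800w, photo-1600.jpg 1600w"
    sizes="(max-width: 800px) 100vw, 800px"
    alt="Description of the photo"
    loading="lazy"
    width="800"
    height="450"
  />
</picture>
```

Explicit `width` and `height` attributes also prevent layout shift while the image loads.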
How common: Found on roughly 40% of scanned pages
The impact: Without a meta description, search engines generate their own snippet from your page content. These auto-generated snippets are rarely as compelling as a well-crafted description, leading to lower click-through rates.
The fix: Write unique, compelling meta descriptions (120-160 characters) for every important page that clearly communicate what the user will find.
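For illustration, a well-formed meta description within the recommended length might look like this (content is hypothetical):

```html
<meta
  name="description"
  content="Compare popular project management tools by price, integrations,
           and team size, with side-by-side feature tables and a quick-start
           guide for small teams."
/>
```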
How common: Found on roughly 60% of scanned pages
The impact: CSS and JavaScript files that block the initial render delay how quickly users first see content. This directly impacts FCP and LCP metrics.
The fix: Inline critical CSS, defer non-essential JavaScript, and use async loading for third-party scripts.
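These three techniques can be sketched together in the document head. File names are placeholders; the `media="print"` plus `onload` swap is a common pattern for non-blocking stylesheet loading, with a `<noscript>` fallback for users without JavaScript:

```html
<head>
  <!-- Inline only the CSS needed for above-the-fold content. -->
  <style>
    header { /* critical styles here */ }
  </style>

  <!-- Load the full stylesheet without blocking first paint. -->
  <link rel="stylesheet" href="/styles.css" media="print" onload="this.media='all'" />
  <noscript><link rel="stylesheet" href="/styles.css" /></noscript>

  <!-- defer: runs after parsing, in order. async: fetches and runs independently. -->
  <script src="/app.js" defer></script>
  <script src="https://example.com/analytics.js" async></script>
</head>
```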
How common: Found on roughly 50% of scanned pages
The impact: Screen reader users can't understand what images represent. Search engines also use alt text to understand image content.
The fix: Add descriptive alt text to every meaningful image. Decorative images should use an empty alt attribute (alt="").
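The distinction between meaningful and decorative images looks like this in practice (file names and descriptions are illustrative):

```html
<!-- Meaningful image: describe what it conveys. -->
<img src="team-photo.jpg" alt="Five team members standing outside the office" />

<!-- Decorative image: empty alt so screen readers skip it entirely. -->
<img src="divider.svg" alt="" />
```

Note that omitting the alt attribute is not the same as an empty one: a missing attribute may cause screen readers to announce the file name instead.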
How common: Found on roughly 65% of scanned pages
The impact: Without structured data, your pages miss opportunities for rich results in search — star ratings, FAQ dropdowns, product info, and more.
The fix: Implement JSON-LD schema markup relevant to your page type (Article, Product, FAQ, LocalBusiness, etc.).
Run a Deep Scan before making optimizations to establish a baseline, then scan again after implementing fixes. Comparing the two results gives you concrete proof of improvement and helps identify if any changes introduced new issues.
A page scoring 45 in performance but 90 in accessibility probably needs performance work. But if that page is a rarely-visited internal tool, while a high-traffic landing page scores 80 in performance and 60 in SEO — fix the landing page first. Context matters more than raw numbers.
Since AI summaries have usage limits, save them for pages you're actively working on improving. The AI summary is most valuable when you have a page with many findings and need help deciding where to start.
The Best Practices category often gets overlooked in favor of performance and SEO, but issues like missing HTTPS, deprecated APIs, and console errors can erode user trust and security. These are often the easiest fixes with the most immediate impact on credibility.
Export your scan results to Excel or generate a shareable link. Technical findings are most useful when the people who can fix them — developers, designers, content writers — have direct access to the data.
Some websites use firewalls, bot protection, or WAF (Web Application Firewall) systems that may block Greadme's analysis requests. When this happens, Deep Scan detects the block and provides you with:
If you control the website being scanned, you can add Greadme's bot to your allowlist to enable complete analysis. This is a one-time configuration that ensures future scans proceed without interruption.
Deep Scan exists because surface-level analysis isn't enough. The difference between a website that ranks, converts, and serves its users well — and one that doesn't — often comes down to dozens of small technical details that are invisible to the naked eye but crystal clear in a comprehensive audit.
By analyzing over 100 parameters across performance, SEO, accessibility, best practices, meta tags, and schema markup, Deep Scan gives you the complete picture. No guessing, no assumptions — just data-driven insights with clear paths to improvement.
The most successful websites aren't built and forgotten. They're continuously monitored, measured, and improved. Deep Scan is the diagnostic tool that makes that continuous improvement possible — turning complex technical analysis into clear, actionable steps that anyone on your team can understand and execute.
Start with the pages that matter most to your business. Run a Deep Scan. Read the findings. Make the fixes. Scan again. That cycle of measure, improve, verify is the foundation of every high-performing website on the internet.
Run a Deep Scan on any URL and get a comprehensive analysis of 100+ parameters across performance, SEO, accessibility, best practices, meta tags, and schema markup — with actionable recommendations to improve every score.
Run Your First Deep Scan