fix /ffxiv/craftsim crawl #512
Conversation
Walkthrough

The changes modify the robots.txt and sitemap.xml loader routes: the robots.txt response disallows crawling of additional paths, and the URL list generated by app/routes/[sitemap.xml].tsx is updated.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant UserAgent as User Agent
    participant Loader as Loader Function
    participant RobotsTxt as robots.txt
    UserAgent->>Loader: Request robots.txt
    Loader->>RobotsTxt: Generate robotText
    RobotsTxt-->>Loader: Return robotText
    Loader-->>UserAgent: Send robots.txt response
```
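The robots.txt flow above can be sketched as a small loader. This is a minimal sketch, not the PR's actual code: the helper name `buildRobotsTxt` is invented here, and the disallowed path is taken from the PR title rather than the diff.

```typescript
// Sketch of a robots.txt route loader (illustrative, not the PR's real code).
export function buildRobotsTxt(): string {
  return [
    'User-agent: *',
    // Path taken from this PR's title; verify against the actual diff.
    'Disallow: /ffxiv/craftsim',
    ''
  ].join('\n')
}

// In a Remix route module this function would be exported as `loader`.
export function robotsLoader(): Response {
  return new Response(buildRobotsTxt(), {
    status: 200,
    headers: { 'Content-Type': 'text/plain' }
  })
}
```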
```mermaid
sequenceDiagram
    participant UserAgent as User Agent
    participant Loader as Loader Function
    participant Sitemap as sitemap.xml
    UserAgent->>Loader: Request sitemap.xml
    Loader->>Sitemap: Generate XML with dynamic URLs
    Sitemap-->>Loader: Return XML document
    Loader-->>UserAgent: Send sitemap.xml response
```
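The sitemap flow can be sketched the same way. Again a rough sketch under assumptions: the sample paths and the `buildSitemapXml` helper are placeholders, not the project's real URL list.

```typescript
// Sketch of a sitemap.xml loader that builds <url> entries from a path list.
const baseURL = 'https://saddlebagexchange.com'

export function buildSitemapXml(paths: string[]): string {
  const urls = paths
    .map((p) => `  <url><loc>${baseURL}${p}</loc></url>`)
    .join('\n')
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    `${urls}\n` +
    '</urlset>'
  )
}

// In a Remix route module this function would be exported as `loader`.
export function sitemapLoader(paths: string[]): Response {
  return new Response(buildSitemapXml(paths), {
    status: 200,
    headers: { 'Content-Type': 'application/xml' }
  })
}
```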
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)
app/routes/[sitemap.xml].tsx (1)
Line range hint 1-3: Consider implementing sitemap pagination for better performance.

The current implementation generates URLs for all items at once, which could lead to very large sitemap files. Consider implementing sitemap pagination following Google's guidelines:
- Split sitemaps into smaller files (max 50,000 URLs or 50MB per file)
- Create a sitemap index file
- Consider different update frequencies for static vs dynamic content
Example implementation of a sitemap index:
```tsx
// sitemapIndex.xml.tsx
export const loader: LoaderFunction = async () => {
  const baseURL = 'https://saddlebagexchange.com'
  const currentDate = new Date().toISOString()

  return new Response(
    `<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>${baseURL}/static-pages-sitemap.xml</loc>
    <lastmod>${currentDate}</lastmod>
  </sitemap>
  <sitemap>
    <loc>${baseURL}/wow-items-sitemap.xml</loc>
    <lastmod>${currentDate}</lastmod>
  </sitemap>
  <sitemap>
    <loc>${baseURL}/ffxiv-items-sitemap.xml</loc>
    <lastmod>${currentDate}</lastmod>
  </sitemap>
</sitemapindex>`,
    { status: 200, headers: { 'Content-Type': 'application/xml' } }
  )
}
```

Also applies to: 14-15
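The splitting step behind such an index can be sketched as a small chunking helper. This is an illustrative utility, not part of the reviewed code; the 50,000-entry default follows the per-file limit mentioned above.

```typescript
// Sketch: split a flat URL list into sitemap-sized chunks (the sitemap
// protocol allows at most 50,000 URLs per file). Generic over the item type.
export function chunkUrls<T>(urls: T[], maxPerFile = 50_000): T[][] {
  const chunks: T[][] = []
  for (let i = 0; i < urls.length; i += maxPerFile) {
    chunks.push(urls.slice(i, i + maxPerFile))
  }
  return chunks
}
```

Each chunk would then be rendered as its own sitemap file and listed in the index shown above.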
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (1)
- app/routes/[sitemap.xml].tsx (2 hunks)
🧰 Additional context used
🔇 Additional comments (2)
app/routes/[sitemap.xml].tsx (2)
Lines 184-203: Verify the impact of removing multiple feature URLs from the sitemap.

Several important URLs have been commented out, including /ffxiv/craftsim, which aligns with the PR objective. However, the removal of other URLs (/queries/full-scan, /ffxiv/marketshare, /wow/best-deals) needs verification:
- Are these features being deprecated?
- Will this impact user accessibility to these features?
- Is this coordinated with the frontend navigation updates?
Lines 119-128: Verify the intentional removal of WoW export and marketshare URLs from the sitemap.

These URLs have been commented out rather than deleted. Please clarify whether this is intentional and whether these features are being deprecated or replaced.
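The pattern the reviewer describes looks roughly like the following. The surrounding array is hypothetical; only the commented-out paths are taken from the review comments above.

```typescript
// Illustrative shape of the change under review: sitemap entries commented
// out in place rather than deleted. The array itself is a made-up stand-in
// for the real list in app/routes/[sitemap.xml].tsx.
export const sitemapPaths: string[] = [
  '/',
  // '/ffxiv/craftsim',    // removed per this PR's objective
  // '/queries/full-scan', // commented out; deprecation unconfirmed
  // '/ffxiv/marketshare', // commented out; deprecation unconfirmed
  // '/wow/best-deals'     // commented out; deprecation unconfirmed
]
```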
Summary by CodeRabbit

- Updated robots.txt to disallow web crawlers from accessing a broader path, enhancing site privacy and control over indexed content.