# LLM Scraper

Turn any webpage into structured data using LLMs.
<img width="1800" alt="Screenshot 2024-04-20 at 23 11 16" src="https://github.com/mishushakov/llm-scraper/assets/10400064/ab00e048-a9ff-43b6-81d5-2e58090e2e65">
LLM Scraper is a TypeScript library that allows you to extract structured data from **any** webpage using LLMs.
> [!IMPORTANT]
> **LLM Scraper was updated to version 1.6.**
>
> The new version comes with Vercel AI SDK 4 support, JSON Schema, better type-safety, improved code generation and updated examples.

> [!TIP]
> Under the hood, it uses function calling to convert pages to structured data. You can find more about this approach [here](https://til.simonwillison.net/gpt3/openai-python-functions-data-extraction).
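Conceptually, this works like handing the page contents and a schema to the Vercel AI SDK's `generateObject`. The following is a minimal, illustrative sketch of that idea (not the library's internal implementation; it assumes the `ai`, `@ai-sdk/openai`, and `zod` packages are installed, and uses a hard-coded HTML snippet where LLM Scraper would use Playwright):

```ts
import { generateObject } from 'ai'
import { openai } from '@ai-sdk/openai'
import { z } from 'zod'

// Illustrative only: in LLM Scraper the page content comes from Playwright.
const pageContent = '<li class="athing">Example story</li>'

// Function calling / structured output: the model fills the schema from the page.
const { object } = await generateObject({
  model: openai.chat('gpt-4o'),
  schema: z.object({ title: z.string() }),
  prompt: `Extract the story title from this HTML:\n${pageContent}`,
})

console.log(object.title)
```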
### Features
- Supports the GPT, Sonnet, Gemini, Llama, and Qwen model series
- Schemas defined with Zod or JSON Schema
- Full type-safety with TypeScript
- Based on Playwright framework
- Streaming objects
- [Code-generation](#code-generation)
- Supports 5 formatting modes (see the usage sketch after this list):
- `html` for loading pre-processed HTML
- `raw_html` for loading raw HTML (no processing)
- `markdown` for loading markdown
- `text` for loading extracted text (using [Readability.js](https://github.com/mozilla/readability))
- `image` for loading a screenshot (multi-modal only)
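The format is chosen via the options object passed to `run`. A minimal sketch, assuming the `scraper`, `page`, and `schema` objects created in the sections below:

```ts
// Choose how the page is presented to the LLM before extraction.
const { data } = await scraper.run(page, schema, {
  format: 'markdown', // 'html' | 'raw_html' | 'markdown' | 'text' | 'image'
})
```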
**Make sure to give it a star!**
<img width="165" alt="Screenshot 2024-04-20 at 22 13 32" src="https://github.com/mishushakov/llm-scraper/assets/10400064/11e2a79f-a835-48c4-9f85-5c104ca7bb49">
## Getting started
1. Install the required dependencies from npm:
```
npm i zod playwright llm-scraper
```
2. Initialize your LLM:
**OpenAI**
```
npm i @ai-sdk/openai
```
```js
import { openai } from '@ai-sdk/openai'
const llm = openai.chat('gpt-4o')
```
**Anthropic**
```
npm i @ai-sdk/anthropic
```
```js
import { anthropic } from '@ai-sdk/anthropic'
const llm = anthropic('claude-3-5-sonnet-20240620')
```
**Google**
```
npm i @ai-sdk/google
```
```js
import { google } from '@ai-sdk/google'
const llm = google('gemini-1.5-flash')
```
**Groq**
```
npm i @ai-sdk/openai
```
```js
import { createOpenAI } from '@ai-sdk/openai'
const groq = createOpenAI({
baseURL: 'https://api.groq.com/openai/v1',
apiKey: process.env.GROQ_API_KEY,
})
const llm = groq('llama3-8b-8192')
```
**Ollama**
```
npm i ollama-ai-provider
```
```js
import { ollama } from 'ollama-ai-provider'
const llm = ollama('llama3')
```
3. Create a new scraper instance and provide it with the LLM:
```js
import LLMScraper from 'llm-scraper'
const scraper = new LLMScraper(llm)
```
## Example
In this example, we're extracting the top stories from Hacker News:
```ts
import { chromium } from 'playwright'
import { z } from 'zod'
import { openai } from '@ai-sdk/openai'
import LLMScraper from 'llm-scraper'
// Launch a browser instance
const browser = await chromium.launch()
// Initialize LLM provider
const llm = openai.chat('gpt-4o')
// Create a new LLMScraper
const scraper = new LLMScraper(llm)
// Open new page
const page = await browser.newPage()
await page.goto('https://news.ycombinator.com')
// Define schema to extract contents into
const schema = z.object({
top: z
.array(
z.object({
title: z.string(),
points: z.number(),
by: z.string(),
commentsURL: z.string(),
})
)
.length(5)
.describe('Top 5 stories on Hacker News'),
})
// Run the scraper
const { data } = await scraper.run(page, schema, {
format: 'html',
})
// Show the result from LLM
console.log(data.top)
await page.close()
await browser.close()
```
Output
```js
[
{
title: "Palette lighting tricks on the Nintendo 64",
points: 105,
by: "ibobev",
commentsURL: "https://news.ycombinator.com/item?id=44014587",
},
{
title: "Push Ifs Up and Fors Down",
points: 187,
by: "goranmoomin",
commentsURL: "https://news.ycombinator.com/item?id=44013157",
},
{
title: "JavaScript's New Superpower: Explicit Resource Management",
points: 225,
by: "olalonde",
commentsURL: "https://news.ycombinator.com/item?id=44012227",
},
{
title: "\"We would be less confidential than Google\" Proton threatens to quit Switzerland",
points: 65,
by: "taubek",
commentsURL: "https://news.ycombinator.com/item?id=44014808",
},
{
title: "OBNC – Oberon-07 Compiler",
points: 37,
by: "AlexeyBrin",
commentsURL: "https://news.ycombinator.com/item?id=44013671",
}
]
```
More examples can be found in the [examples](./examples) folder.
## Streaming
Replace your `run` function with `stream` to get a partial object stream (Vercel AI SDK only).
```ts
// Run the scraper in streaming mode
const { stream } = await scraper.stream(page, schema)
// Stream the result from LLM
for await (const data of stream) {
console.log(data.top)
}
```
## Code-generation
Using the `generate` function, you can generate a reusable Playwright script that scrapes the contents according to a schema.
```ts
// Generate code and run it on the page
const { code } = await scraper.generate(page, schema)
const result = await page.evaluate(code)
const data = schema.parse(result)
// Show the parsed result
console.log(data.top)
```
## Contributing
As an open-source project, we welcome contributions from the community. If you are experiencing any bugs or want to add some improvements, please feel free to open an issue or pull request.
", Assign "at most 3 tags" to the expected json: {"id":"9564","tags":[]} "only from the tags list I provide: [{"id":77,"name":"3d"},{"id":89,"name":"agent"},{"id":17,"name":"ai"},{"id":54,"name":"algorithm"},{"id":24,"name":"api"},{"id":44,"name":"authentication"},{"id":3,"name":"aws"},{"id":27,"name":"backend"},{"id":60,"name":"benchmark"},{"id":72,"name":"best-practices"},{"id":39,"name":"bitcoin"},{"id":37,"name":"blockchain"},{"id":1,"name":"blog"},{"id":45,"name":"bundler"},{"id":58,"name":"cache"},{"id":21,"name":"chat"},{"id":49,"name":"cicd"},{"id":4,"name":"cli"},{"id":64,"name":"cloud-native"},{"id":48,"name":"cms"},{"id":61,"name":"compiler"},{"id":68,"name":"containerization"},{"id":92,"name":"crm"},{"id":34,"name":"data"},{"id":47,"name":"database"},{"id":8,"name":"declarative-gui "},{"id":9,"name":"deploy-tool"},{"id":53,"name":"desktop-app"},{"id":6,"name":"dev-exp-lib"},{"id":59,"name":"dev-tool"},{"id":13,"name":"ecommerce"},{"id":26,"name":"editor"},{"id":66,"name":"emulator"},{"id":62,"name":"filesystem"},{"id":80,"name":"finance"},{"id":15,"name":"firmware"},{"id":73,"name":"for-fun"},{"id":2,"name":"framework"},{"id":11,"name":"frontend"},{"id":22,"name":"game"},{"id":81,"name":"game-engine "},{"id":23,"name":"graphql"},{"id":84,"name":"gui"},{"id":91,"name":"http"},{"id":5,"name":"http-client"},{"id":51,"name":"iac"},{"id":30,"name":"ide"},{"id":78,"name":"iot"},{"id":40,"name":"json"},{"id":83,"name":"julian"},{"id":38,"name":"k8s"},{"id":31,"name":"language"},{"id":10,"name":"learning-resource"},{"id":33,"name":"lib"},{"id":41,"name":"linter"},{"id":28,"name":"lms"},{"id":16,"name":"logging"},{"id":76,"name":"low-code"},{"id":90,"name":"message-queue"},{"id":42,"name":"mobile-app"},{"id":18,"name":"monitoring"},{"id":36,"name":"networking"},{"id":7,"name":"node-version"},{"id":55,"name":"nosql"},{"id":57,"name":"observability"},{"id":46,"name":"orm"},{"id":52,"name":"os"},{"id":14,"name":"parser"},{"id":74,"name":"react"},{"id":82,"name":"real-time"},{"id":56,"name":"robot"},{"id":65,"name":"runtime"},{"id":32,"name":"sdk"},{"id":71,"name":"search"},{"id":63,"name":"secrets"},{"id":25,"name":"security"},{"id":85,"name":"server"},{"id":86,"name":"serverless"},{"id":70,"name":"storage"},{"id":75,"name":"system-design"},{"id":79,"name":"terminal"},{"id":29,"name":"testing"},{"id":12,"name":"ui"},{"id":50,"name":"ux"},{"id":88,"name":"video"},{"id":20,"name":"web-app"},{"id":35,"name":"web-server"},{"id":43,"name":"webassembly"},{"id":69,"name":"workflow"},{"id":87,"name":"yaml"}]" returns me the "expected json"