diff --git a/CHANGELOG.md b/CHANGELOG.md new file mode 100644 index 0000000..a74bf3e --- /dev/null +++ b/CHANGELOG.md @@ -0,0 +1,44 @@ +# CHANGELOG + +## v4.0.1 + +- 🔨 Specify node version supported node >= 12 +- 🐛 Replace `??` with `||` to support node version before v14.5 [#73](https://github.com/kiliman/tailwindui-crawler/issues/73) + +## v4.0.0 + +- ✨ Support new tailwindui.com site structure +- ✨ Add ability to download new Page Templates `TEMPLATES=1` + +## v3.2.3 + +- 🐛 Fixes issue with parsing Alpine "preview" when not vaild HTML [#60](https://github.com/kiliman/tailwindui-crawler/issues/60) + +## v3.2.2 + +- 🔨 Replace email with License User in preview +- 📦 Update package versions + +## v3.2.1 + +- 🐛 Use correct selector to get component title due to change in format + +## v3.2.0 + +- ✨ Add bin to package [#51](https://github.com/kiliman/tailwindui-crawler/issues/51) +- ✨ Allow preview index to be relative to file not cwd [#50](https://github.com/kiliman/tailwindui-crawler/issues/50) +- ♻️ Refactor fetch to fetchWithRetry + +## v3.1.6 + +- 🔥 Remove optional chaining to support older node versions +- 🔥 Remove rewriting image URL now that images are downloaded locally + +## v3.1.5 + +- 😍 Add @Yagnik as contributor +- 🔨 Allow downloading images for preview [#48](https://github.com/kiliman/tailwindui-crawler/issues/48) + +## v3.1.4 + +- 🔨 Remove ?id=hash from HTML diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..71ce493 --- /dev/null +++ b/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) Michael Carter + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/README.md b/README.md new file mode 100644 index 0000000..26d8951 --- /dev/null +++ b/README.md @@ -0,0 +1,196 @@ +# tailwindui-crawler + + + +[![All Contributors](https://img.shields.io/badge/all_contributors-9-orange.svg?style=flat-square)](#contributors-) + + + + + +This script will crawl the [tailwindui.com](https://tailwindui.com) website and download all the +components to the `./output` folder. + +## 🛠 How to use + +To install, clone this repo and run `yarn` or `npm install` to pull down the dependencies. + +Then create a `.env` file with your email, password, and optional output folder. 
+ +```ini +EMAIL=youremail +PASSWORD=yourpassword +# OUTPUT optional, defaults to ./output +OUTPUT=/path/to/output +# LANGUAGES defaults to html +LANGUAGES=html,react,vue,alpine +# BUILDINDEX generate index file to view components offline +BUILDINDEX=(0 | 1) +# TEMPLATES download template files +TEMPLATES=(0 | 1) +``` + +> NOTE: The tool uses [dotenv-expand](https://github.com/motdotla/dotenv-expand) +> to support variable expansion like `$HOME/path/to/output` so if your password +> or any other value includes a `$`, make sure you add a `\` (backslash) to +> escape the `$`. For example, `PASSWORD=p@\$\$w0rd` +> +> Also, `dotenv` does **not** support inline comments, so do not do something +> like `LANGUAGES=html,react,vue # some comment` as it will not get the correct +> values. + +Then finally, run `yarn start` or `npm start` + +The script will login to [tailwindui.com](https://tailwindui.com) with your credentials, and download all the +components as individual files in the `./output` folder. + +## 🤔 What's it for? + +The benefit of pulling down all the components is that you can commit them to a local or +private repo, and by running this periodically, you can see exactly which files were added +or changed. Hopefully, some time in the future, they will open up a private repo for those +that have purchased the library. + +## 🚀 New v4.0 + +The crawler has been updated to support the new Tailwind UI site as of 2022-06-27. You can +also download the new Page Templates. Add `TEMPLATES=1` to your _.env_ file. If you are using the GitHub action to crawl, you will need to update your deploy.yml. See instructions below. + +The crawler now supports the new Tailwind UI site and can download HTML, React +and Vue versions of the components. + +You can also download the "alpine" version of the components. ⚠️ WARNING: the +alpine code is **NOT** production ready. It does not support accesibility and +is used to show a preview of how the component interactivity can be implemented. + +It also adds the ability to generate an index page that emulates the [tailwindui.com](https://tailwindui.com) website +so you can browse components offline. + +### 🗂 Preview page + +You can set the `.env` key `BUILDINDEX=1` to have the crawler generate an index file similar to the components +page on [tailwindui.com](https://tailwindui.com). Install and run the [serve](https://www.npmjs.com/package/serve) package +to view the index. + +> NOTE: The HTML Preview does not apply transformers. It's a copy of the +> component site on [tailwindui.com](https://tailwindui.com). + +```bash +yarn global add serve +cd $OUTPUT/preview # change to your preview folder +serve +``` + +### ⚙️ Example `.env` file + +```ini +EMAIL=****** +PASSWORD=****** +OUTPUT=$HOME/Projects/tailwindui +LANGUAGES=html,react,vue,alpine +BUILDINDEX=1 +TEMPLATES=1 +``` + +## 🤖 Automatically keep a **private** GitHub Repository up-to-date + +**NOTE**: GitHub action has been updated in v4.0.0. Please make sure your _default.yml_ file is updated with the latest actions. + +You can automatically keep a **private** GitHub repository up-to-date with component changes from TailwindUI by using this tool with GitHub Actions. + +1. [Create a **private** GitHub repository](https://github.com/new/). +1. [Add `TAILWINDUI_EMAIL` and `TAILWINDUI_PASSWORD` secrets to the GitHub repository](https://help.github.com/en/actions/configuring-and-managing-workflows/creating-and-storing-encrypted-secrets#creating-encrypted-secrets). +1. 
[Optionally create a `.env` file with additional settings for the crawler](#%EF%B8%8F-example-env-file). +1. Create a new file `.github/workflows/default.yml`: + + ```yml + name: Update + on: + schedule: + - cron: '0 0 * * *' # Every day at midnight + + jobs: + update: + name: Update + runs-on: ubuntu-latest + steps: + - name: Checkout + uses: actions/checkout@v2 + - name: Run crawler + uses: kiliman/tailwindui-crawler-action@v1.3.0 + with: + email: ${{ secrets.TAILWINDUI_EMAIL }} + password: ${{ secrets.TAILWINDUI_PASSWORD }} + ``` + + > NOTE: Make sure to update to the latest action `v1.1.0` to support the crawler v3+ + + Read more about the schedule cron syntax in [the official GitHub Actions documentation](https://help.github.com/en/actions/reference/events-that-trigger-workflows#scheduled-events-schedule). + + > NOTE: if you're creating a new repository or have updated your default branch from `master`, you will have to specify with `branch` (and/or `current_branch`) like so: + + ```diff + # ... + - name: Run crawler + uses: kiliman/tailwindui-crawler-action@v1.3.0 + with: + email: ${{ secrets.TAILWINDUI_EMAIL }} + password: ${{ secrets.TAILWINDUI_PASSWORD }} + + branch: main + + current_branch: main + ``` + +### Email Notifications + +To be emailed whenever there is a change to a component, simply setup [GitHub Notifications](https://help.github.com/en/github/administering-a-repository/about-email-notifications-for-pushes-to-your-repository#enabling-email-notifications-for-pushes-to-your-repository) on your repository. + +## 🚦 Upgrading to v3. + +This is a major change. Unfortunately, v2 will no longer work with the existing +site due to the updates they may to add support for React/Vue components. Please also note that the [GitHub Action](#-automatically-keep-a-private-github-repository-up-to-date) has been updated from `v1.0.0` to `v1.1.0`. + +Currently, there is no support for transformers, as the need for them is not +as critical since the components don't need to be converted to React or Vue. + +NOTE: Since this script is essentially screen scraping, there's the potential +of it breaking if the HTML structure changes. I will do my best to keep it in sync with +the website. + +## 😍 Thank you + +Thanks to Adam and Steve for making an amazing library. This has definitely made creating +a UI for my applications a pleasant experience. + +Enjoy and let me know if you have any questions. + +Kiliman + +## Contributors ✨ + +Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)): + + + + + + + + + + + + + + + + + + +

- Kiliman 💻
- Simon Waloschek 💻
- Pavel Fomchenkov 💻
- Robin Malfait 💻
- Miguel Piedrafita 💻 📖 🤔
- Vlad Dumitrescu 📖
- C-Bass 💻
- Greg Brimble 📖 🔧
- Yagnik 💻
+ + + + + + +This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome! diff --git a/images/index-component-code.png b/images/index-component-code.png new file mode 100644 index 0000000..147b9d3 Binary files /dev/null and b/images/index-component-code.png differ diff --git a/images/index-main.png b/images/index-main.png new file mode 100644 index 0000000..4707141 Binary files /dev/null and b/images/index-main.png differ diff --git a/images/index-section.png b/images/index-section.png new file mode 100644 index 0000000..666bc2c Binary files /dev/null and b/images/index-section.png differ diff --git a/images/tailwindui-crawler.png b/images/tailwindui-crawler.png new file mode 100644 index 0000000..0cef49d Binary files /dev/null and b/images/tailwindui-crawler.png differ diff --git a/index.mjs b/index.mjs new file mode 100644 index 0000000..93713f0 --- /dev/null +++ b/index.mjs @@ -0,0 +1,463 @@ +#!/usr/bin/env node +import * as cheerio from 'cheerio' +import * as dotenv from 'dotenv' +import dotenvExpand from 'dotenv-expand' +import * as fs from 'fs' +import https from 'https' +import * as _path from 'path' +import { exit } from 'process' +// polyfill matchAll for node versions < 12 +import matchAll from 'string.prototype.matchall' + +dotenvExpand(dotenv.config()) +matchAll.shim() + +const { dirname, basename } = _path + +export const kebab = (s) => s.toLowerCase().replace(/[^\w.]/g, '-') +export const camelCase = (s) => { + const matches = Array.from(s.matchAll(/[a-zA-Z0-9]+/g)) + return ( + matches[0][0].toLowerCase() + + matches + .slice(1) + .map(([item]) => item[0].toUpperCase() + item.substr(1).toLowerCase()) + .join('') + ) +} + +export const cleanFilename = (filename) => + filename + .toLowerCase() + .replace(/[^\w.]/g, '_') + .replace(/^_+|_+$/g, '') + +export const ensureDirExists = (dir) => { + if (!fs.existsSync(dir)) { + fs.mkdirSync(dir, { recursive: true }) + } +} + +export function mergeDeep(target, source) { + const isObject = (obj) => obj && typeof obj === 'object' + + if (!isObject(target) || !isObject(source)) { + return source + } + + Object.keys(source).forEach((key) => { + const targetValue = target[key] + const sourceValue = source[key] + + if (Array.isArray(targetValue) && Array.isArray(sourceValue)) { + target[key] = targetValue.concat(sourceValue) + } else if (isObject(targetValue) && isObject(sourceValue)) { + target[key] = mergeDeep(Object.assign({}, targetValue), sourceValue) + } else { + target[key] = sourceValue + } + }) + + return target +} + +const rootUrl = 'https://tailwindui.com' +const output = process.env.OUTPUT || './output' +// list of languages to save (defaults to html) +const languages = (process.env.LANGUAGES || 'html').split(',') +const retries = 3 +let oldAssets = {} +let newAssets = {} +const regexEmail = new RegExp(process.env.EMAIL.replace(/[.@]/g, '\\$&'), 'g') +let cookies = {} + +async function fetchHttps(url, options = {}, body = undefined) { + return new Promise((resolve, reject) => { + const uri = new URL(url) + options = { + hostname: uri.hostname, + port: uri.port || 443, + path: uri.pathname + uri.search, + method: 'GET', + ...options, + } + let response + const req = https.request(options, (res) => { + response = res + response.body = Buffer.alloc(0) + response.status = res.statusCode + response.text = async () => response.body.toString() + response.json = async () => JSON.parse(await response.text()) + response.arrayBuffer = async () => 
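// return the accumulated response body as its underlying ArrayBuffer (mirrors the fetch API's Response#arrayBuffer)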
response.body.buffer + + const setCookieHeaders = response.headers['set-cookie'] + if (setCookieHeaders) { + const newCookies = parseSetCookieHeaders(setCookieHeaders) + cookies = { ...cookies, ...newCookies } + } + + res.on('data', (d) => { + response.body = Buffer.concat([response.body, d]) + }) + res.on('end', () => { + resolve(response) + }) + }) + + req.on('error', (error) => { + reject.err(error) + }) + if (body) { + req.write(body) + } + req.end() + }) +} + +async function fetchWithRetry(url, retries, options = {}) { + let tries = 0 + while (true) { + const start = new Date().getTime() + let response + let cookieHeader = getCookieHeader(cookies) + console.log(`🔍 Fetching ${url}`) + try { + response = await fetchHttps(url, { + ...options, + headers: { + ...options?.headers, + cookie: cookieHeader, + }, + }) + const elapsed = new Date().getTime() - start + console.log(`⏱ ${elapsed}ms (${response.status})`) + if (response.status === 302) { + return fetchWithRetry(response.headers.location, retries, options) + } + return response + } catch (err) { + console.error(err) + const elapsed = new Date().getTime() - start + tries++ + const status = response ? response.status : 500 + console.log(`🔄 ${elapsed}ms (${status}) Try #${tries} ${url}`) + if (tries === retries) { + console.log(`‼️ Error downloading ${url}.\n${err.message}`) + exit(1) + } + } + } +} + +function parseSetCookieHeaders(setCookieHeaders) { + let cookies = {} + setCookieHeaders.forEach((header) => { + const [cookie] = header.split(';') + const [name, value] = cookie.split('=') + cookies[name] = decodeURIComponent(value) + }) + return cookies +} + +async function downloadPage(url) { + const response = await fetchWithRetry(rootUrl + url, retries) + const html = await response.text() + return html.trim() +} + +async function postData(url, data) { + const body = JSON.stringify(data) + + return fetchHttps( + rootUrl + url, + { + method: 'POST', + headers: { + 'content-type': 'application/json', + 'content-length': Buffer.byteLength(body), + cookie: getCookieHeader(cookies), + 'x-inertia': 'true', + 'x-xsrf-token': cookies['XSRF-TOKEN'], + }, + }, + body, + ) +} + +function getCookieHeader(cookies) { + return ( + Object.entries(cookies) + //.map(([name, value]) => `${name}=${encodeURIComponent(value)}`) + .map(([name, value]) => `${name}=${value}`) + .join('; ') + ) +} + +async function processComponentPage(url) { + const html = await downloadPage(url) + if (!html.includes(process.env.EMAIL)) { + console.log(`🚫 Not logged in`) + process.exit() + } + const $ = cheerio.load(html) + // component data stored in #app data-page attribute + const json = $('#app').attr('data-page') + const data = JSON.parse(json) + + const components = data.props.subcategory.components + console.log( + `🔍 Found ${components.length} component${ + components.length === 1 ? 
'' : 's' + }`, + ) + + for (let i = 0; i < components.length; i++) { + await processComponent(url, components[i]) + } + + if (process.env.BUILDINDEX === '1') { + const preview = replaceTokens(html) + await savePageAndResources(url, preview, $) + } +} + +function replaceTokens(html) { + // replace tokens in page with constant so it won't generate superfluous diffs + // also replace links to css/js assets to remove id querystring + const regexTokens = /name="(csrf-token|_token)"\s+(content|value)="(.+?)"/gm + const regexAssets = /(css|js)(\?id=[a-f0-9]+)/gm + return html + .replace(regexTokens, `name="$1" $2="CONSTANT_TOKEN"`) + .replace(regexAssets, '$1') +} + +async function processComponent(url, component) { + const title = component.name + const filename = cleanFilename(title) + const path = `${url}/${filename}` + + // output snippets by language + component.snippets.forEach((snippet) => { + const language = snippet.language.toLowerCase() + if (!languages.includes(language)) return + saveLanguageContent(path, language, snippet.snippet) + }) + + // save resources required by snippet preview + const html = component.iframeHtml + // if languages contains alpine, then save the preview as alpine + if (languages.includes('alpine')) { + const $body = cheerio.load(html)('body') + // default code to body + let code = $body.html().trim() + // strip empty wrapper divs if present + let $container = findFirstElementWithClass($body.children().first()) + + if ($container) { + code = $container.parent().html().trim() + } + + const disclaimer = ` +` + saveLanguageContent(path, 'alpine', `${disclaimer}${code}`) + } + + await savePageAndResources(url, null, cheerio.load(html)) +} + +function findFirstElementWithClass($elem) { + // ignore empty class and elements with _style attribute + if ( + $elem.attr('class') && + $elem.attr('class').length > 0 && + !$elem.attr('_style') + ) { + return $elem + } + if ($elem.children().length === 0) return null + return findFirstElementWithClass($elem.children().first()) +} +async function saveLanguageContent(path, language, code) { + const ext = + language === 'react' ? 'jsx' : language === 'alpine' ? 
'html' : language + const dir = `${output}/${language}${dirname(path)}` + ensureDirExists(dir) + + const filename = basename(path) + const filePath = `${dir}/${filename}.${ext}` + console.log(`📝 Writing ${language} ${filename}.${ext}`) + fs.writeFileSync(filePath, code) +} + +async function savePageAndResources(url, html, $) { + // download referenced css and js inside + const items = $('head>link,script,img') + for (let i = 0; i < items.length; i++) { + const $item = $(items[i]) + const url = $item.attr('src') || $item.attr('href') + if (!url || !url.startsWith('/')) continue + + // strip off querystring + const path = new URL(rootUrl + url).pathname + const dir = `${output}/preview${dirname(path)}` + const filePath = `${dir}/${basename(path)}` + // check assets to see if we've already downloaded this file + if (newAssets[filePath]) continue + + ensureDirExists(dir) + + let options = {} + if (oldAssets[filePath]) { + options = { + method: 'GET', + headers: { + 'If-None-Match': oldAssets[filePath], // etag from previous GET + }, + } + } + + const response = await fetchWithRetry(rootUrl + url, retries, options) + // check etag + if (response.status === 304) { + continue + } + newAssets[filePath] = response.headers['etag'] + + const content = await response.arrayBuffer() + fs.writeFileSync(filePath, Buffer.from(content)) + } + if (html) { + // write preview index page + const dir = `${output}/preview${url}` + ensureDirExists(dir) + html = html.replace(regexEmail, 'Licensed User') + fs.writeFileSync(`${dir}/index.html`, html) + console.log(`📝 Writing ${url}/index.html`) + } +} + +async function login() { + await downloadPage('/login') + + const response = await postData('/login', { + email: process.env.EMAIL, + password: process.env.PASSWORD, + remember: false, + }) + return response.status === 409 || response.status === 302 +} + +async function saveTemplates() { + const html = await downloadPage('/templates') + const $ = cheerio.load(html) + const $templates = $('section[id^="product"]') + console.log( + `🔍 Found ${$templates.length} template${ + $templates.length === 1 ? 
'' : 's' + }`, + ) + for (let i = 0; i < $templates.length; i++) { + const $template = $($templates[i]) + const $link = $template.find('h2>a') + const title = $link.text() + const url = $link.attr('href') + console.log(`🔍 Downloading template ${title}`) + + const path = new URL(url).pathname + const dir = `${output}${dirname(path)}` + const filePath = `${dir}/${basename(path)}.zip` + ensureDirExists(dir) + + let options = {} + if (oldAssets[filePath]) { + options = { + method: 'GET', + headers: { + 'If-None-Match': oldAssets[filePath], // etag from previous GET + }, + } + } + const response = await fetchWithRetry(url + '/download', retries, options) + // check etag + if (response.status === 304) { + continue + } + newAssets[filePath] = response.headers['etag'] + + const content = await response.arrayBuffer() + fs.writeFileSync(filePath, Buffer.from(content)) + } +} + +;(async function () { + const start = new Date().getTime() + try { + ensureDirExists(output) + // load old assets + if (fs.existsSync(`${output}/assets.json`)) { + oldAssets = JSON.parse(fs.readFileSync(`${output}/assets.json`)) + newAssets = JSON.parse(JSON.stringify(oldAssets)) + } + console.log('🔐 Logging into tailwindui.com...') + const success = await login() + if (!success) { + console.log('🚫 Invalid credentials') + return 1 + } + console.log('✅ Success!\n') + + console.log(`🗂 Output is ${output}`) + const html = await downloadPage('/components') + const $ = cheerio.load(html) + + const library = {} + const links = $('.grid a') + let urls = [] + for (let i = 0; i < links.length; i++) { + const link = links[i] + const url = $(link).attr('href') + if (!url.startsWith('/components')) continue + urls.push(url) + } + const count = process.env.COUNT || urls.length + for (let i = 0; i < count; i++) { + const url = urls[i] + console.log(`⏳ Processing ${url}...`) + const components = await processComponentPage(url) + mergeDeep(library, components) + console.log() + } + if (process.env.BUILDINDEX === '1') { + const preview = replaceTokens(html) + console.log('⏳ Saving preview page... this may take awhile') + await savePageAndResources('/components', preview, $) + fs.copyFileSync( + _path.join(process.cwd(), 'previewindex.html'), + `${output}/preview/index.html`, + ) + console.log() + } + if (process.env.TEMPLATES === '1') { + console.log('⏳ Saving templates...') + ensureDirExists(`${output}/preview`) + await saveTemplates() + console.log() + } + // save assets file + fs.writeFileSync( + `${output}/assets.json`, + JSON.stringify(newAssets, null, 2), + ) + } catch (err) { + console.error('‼️ ', err) + return 1 + } + const elapsed = new Date().getTime() - start + console.log(`🏁 Done! 
${elapsed / 1000} seconds`) + return 0 +})() diff --git a/package.json b/package.json new file mode 100644 index 0000000..a643a9e --- /dev/null +++ b/package.json @@ -0,0 +1,30 @@ +{ + "name": "tailwindui-crawler", + "version": "4.0.1", + "description": "Download TailwindUI components", + "main": "index.mjs", + "type": "module", + "author": "kiliman ", + "license": "MIT", + "scripts": { + "contributors:add": "all-contributors add", + "contributors:generate": "all-contributors generate", + "start": "node index.mjs" + }, + "devDependencies": { + "all-contributors-cli": "^6.14.0", + "prettier": "^2.2.1" + }, + "dependencies": { + "cheerio": "^1.0.0-rc.6", + "cookie": "^0.5.0", + "dotenv": "^8.2.0", + "dotenv-expand": "^5.1.0", + "form-urlencoded": "^6.0.0", + "string.prototype.matchall": "^4.0.2" + }, + "bin": "./index.mjs", + "engines": { + "node": ">=12.0.0" + } +} diff --git a/previewindex.html b/previewindex.html new file mode 100644 index 0000000..0608add --- /dev/null +++ b/previewindex.html @@ -0,0 +1,11 @@ + + + + + + Tailwind UI Preview + + + View Components + + \ No newline at end of file diff --git a/transformers/addTailwindCss.js b/transformers/addTailwindCss.js new file mode 100644 index 0000000..3b54150 --- /dev/null +++ b/transformers/addTailwindCss.js @@ -0,0 +1,12 @@ +const meta = + '' +const url = + process.env.ADDTAILWINDCSS_URL || + 'https://tailwindui.com/css/components-v2.css' + +module.exports = function($) { + // add stylesheets to + $('head') + .append(meta) + .append(``) +} diff --git a/transformers/changeColor.js b/transformers/changeColor.js new file mode 100644 index 0000000..553cf8e --- /dev/null +++ b/transformers/changeColor.js @@ -0,0 +1,17 @@ +module.exports = function($, { rootUrl }) { + updateColor($, $('body')) +} + +function updateColor($, $parent) { + let _class = $parent.attr('class') + if (_class) { + $parent.attr( + 'class', + _class.replace(/\bindigo\b/g, process.env.CHANGECOLOR_TO), + ) + } + + $parent.children().each((_, child) => { + updateColor($, $(child)) + }) +} diff --git a/transformers/changeLogo.js b/transformers/changeLogo.js new file mode 100644 index 0000000..ef60026 --- /dev/null +++ b/transformers/changeLogo.js @@ -0,0 +1,8 @@ +module.exports = function($) { + $('img').each((_, img) => { + const $img = $(img) + const src = $img.attr('src') + const isLogo = src.indexOf('/logos/') !== -1 + $img.attr('src', isLogo ? process.env.CHANGELOGO_URL : src) + }) +} diff --git a/transformers/convertReact.js b/transformers/convertReact.js new file mode 100644 index 0000000..badf093 --- /dev/null +++ b/transformers/convertReact.js @@ -0,0 +1,115 @@ +const { ensureDirExists, camelCase } = require('../utils') + +module.exports = function($, { output, title, path, fs }) { + let code = $('body') + .html() + // Replace `class=` with `className=` + .replace(/class=/g, 'className=') + // Replace inline styles with style object + .replace(/style="([^"]*)"/g, (_, styles) => { + const regex = /(\s*([\w-]*)\s*:\s*([^;]+))/g + const matches = Array.from(styles.matchAll(regex)) + return `style={{${matches + .map(m => `${camelCase(m[2])}: "${m[3]}"`) + .join(',')}}}` + }) + // Replace all attributes starting with @. + // + // E.g.: `@click.stop` -> `data-todo-at-stop` + .replace( + / @([^"]*)=/g, + (_all, group) => ` data-todo-at-${group.replace(/[.:]/g, '-')}=`, + ) + + // Replaces all attributes starting with x-. 
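// (attribute names like these aren't valid JSX, so they're stashed as data-todo-* markers to be converted by hand)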
+ // + // E.g.: `x-transition:enter` -> `data-todo-x-transition-enter` + .replace( + / x-([^ "]*)/g, + (_all, group) => ` data-todo-x-${group.replace(/[.:]/g, '-')}`, + ) + + // Replace html comments with JSX comments + .replace(/()/g, '{/* $2 */}') + + // Replace `tabindex="0"` with `tabIndex={0}` + .replace(/tabindex="([^"]*)"/g, 'tabIndex={$1}') + + // Replace `datetime` with `dateTime` for