ripgrep
til/readthedocs/custom-subdomain.md
1 # Pointing a custom subdomain at Read the Docs
2
3 I host the documentation for Datasette on [Read the Docs](https://readthedocs.org/). It previously lived at https://datasette.readthedocs.io/, but today I moved it to a custom subdomain, https://docs.datasette.io/
4
5 Most of this is handled by the https://readthedocs.org/dashboard/datasette/domains/ interface, but there were a couple of extra details.
9 First, I needed to add a `CNAME` record for `docs.datasette.io` pointing to `readthedocs.io.`
10
11 It's good to do that in advance of adding the domain to Read the Docs, because the moment you add the domain they start redirecting traffic from the old `datasette.readthedocs.io` domain to the new custom `docs.datasette.io` domain - even if DNS for the new domain hasn't propagated yet and its certificate hasn't been issued!
12
13 When I first tried this I got the following error in the *Read the Docs* interface:
til/datasette/crawling-datasette-with-datasette.md
5 To do this, I needed the content of those tutorials in a SQLite database table. But the tutorials are implemented as static pages in [templates/pages/tutorials](https://github.com/simonw/datasette.io/tree/9dffe361b0210b9d8b1f2fb820a3f2193f0f2fc7/templates/pages/tutorials) - so I needed to crawl that content and insert it into a table.
6
7 I ended up using a combination of the `datasette.client` mechanism ([documented here](https://docs.datasette.io/en/stable/internals.html#internals-datasette-client)), [Beautiful Soup](https://www.crummy.com/software/BeautifulSoup/bs4/doc/) and [sqlite-utils](https://sqlite-utils.readthedocs.io/) - all wrapped up in [a Python script](https://github.com/simonw/datasette.io/blob/9dffe361b0210b9d8b1f2fb820a3f2193f0f2fc7/index_tutorials.py) that's now called as part of [the GitHub Actions build process](https://github.com/simonw/datasette.io/blob/9dffe361b0210b9d8b1f2fb820a3f2193f0f2fc7/scripts/build.sh#L35) for the site.
8
9 I'm also using [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#config-dir).
sba-loans-covid-19-datasette/README.md
49 I downloaded the "6-digit 2017 Code File" XLS file from https://www.census.gov/eos/www/naics/downloadables/downloadables.html and opened it in Numbers, then exported the data back out again as a two-column CSV: [naics_2017.csv](https://github.com/simonw/sba-loans-covid-19-datasette/blob/main/naics_2017.csv)
50
51 Then I ran the following commands - using [sqlite-utils](https://sqlite-utils.readthedocs.io/en/stable/cli.html) - to import that data and set it up as a foreign key from the `NAICSCode` column.
52
53 # First create the table with an integer primary key and a text column
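The actual `sqlite-utils` commands are truncated in this excerpt. As an illustration of the same schema steps using only the standard library `sqlite3` module (table and column names follow the text; the sample row is made up), the setup looks roughly like this:

```python
import sqlite3

# Plain-sqlite3 sketch of the steps the README performs with sqlite-utils:
# a lookup table with an integer primary key and a text column, plus a
# foreign key from the loans table's NAICSCode column into it.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# The NAICS lookup table: integer primary key, text description
conn.execute(
    "CREATE TABLE naics (NAICSCode INTEGER PRIMARY KEY, Description TEXT)"
)
conn.execute("INSERT INTO naics VALUES (?, ?)", (111110, "Soybean Farming"))

# The loans table declares NAICSCode as a foreign key into naics
conn.execute(
    "CREATE TABLE loans (id INTEGER PRIMARY KEY, BorrowerName TEXT, "
    "NAICSCode INTEGER REFERENCES naics(NAICSCode))"
)
conn.execute("INSERT INTO loans VALUES (1, 'Example LLC', 111110)")
conn.commit()

# The join now resolves each loan's NAICS code to its description
row = conn.execute(
    "SELECT Description FROM loans JOIN naics USING (NAICSCode)"
).fetchone()
print(row[0])  # prints: Soybean Farming
```

With the foreign key in place, Datasette renders the `NAICSCode` column as a link to the corresponding row in the lookup table.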
pocket-to-sqlite/README.md
42 ## Using with Datasette
43
44 The SQLite database produced by this tool is designed to be browsed using [Datasette](https://datasette.readthedocs.io/). Use the [datasette-render-timestamps](https://github.com/simonw/datasette-render-timestamps) plugin to improve the display of the timestamp values.
homebrew-datasette/README.md
13 ## Installing plugins
14
15 [Datasette plugins](https://datasette.readthedocs.io/en/stable/plugins.html) need to be installed into the same Python environment as `datasette`.
16
17 The easiest way to install them is to use the new `datasette install` command:
homebrew-datasette/Formula/sqlite-utils.rb
2 include Language::Python::Virtualenv
3 desc "CLI utility for manipulating SQLite databases"
4 homepage "https://sqlite-utils.readthedocs.io/"
5 url "https://files.pythonhosted.org/packages/6a/9d/42871b729018c48ecdf9f14f14e5f6fc99416a088b6040b32ce494528ddb/sqlite-utils-2.21.tar.gz"
6 sha256 "fdb3ac8a2ce7da4253a04d9e57b7a1bbb4c2d756416fd9ae5c4459002453edc7"
homebrew-datasette/Formula/datasette.rb
2 include Language::Python::Virtualenv
3 desc "Open source multi-tool for exploring and publishing data"
4 homepage "https://datasette.readthedocs.io/"
5 url "https://files.pythonhosted.org/packages/f2/ba/1b5f182c3f1769c0863bcaa77406bdcb81c92e31bb579959c01b1d8951c0/datasette-0.50.2.tar.gz"
6 sha256 "72e3127a5007103e2b2e7e35172d7da256471c54370447199ffafb631526c0b4"
healthkit-to-sqlite/README.md
31 ```
32
33 You can explore the resulting data using [Datasette](https://datasette.readthedocs.io/) like this:
34
35 $ datasette healthkit.db
geojson-to-sqlite/README.md
87 ## Using this with Datasette
88
89 Databases created using this tool can be explored and published using [Datasette](https://datasette.readthedocs.io/).
90
91 The Datasette documentation includes a section on [how to use it to browse SpatiaLite databases](https://datasette.readthedocs.io/en/stable/spatialite.html).
92
93 The [datasette-leaflet-geojson](https://datasette.io/plugins/datasette-leaflet-geojson) plugin can be used to visualize columns containing GeoJSON geometries on a [Leaflet](https://leafletjs.com/) map.
genome-to-sqlite/README.md
25 ```
26
27 You can explore the resulting data using [Datasette](https://datasette.readthedocs.io/) like this:
28
29 $ datasette genome.db --config facet_time_limit_ms:1000
fara-datasette/README.md
5 > This repository has been replaced by https://github.com/simonw/fara-history
6
7 This code pulls the latest CSVs from https://efile.fara.gov/ords/f?p=API:BULKDATA and loads them into a SQLite database suitable for publishing using [Datasette](https://datasette.readthedocs.io/)
8
9
dogsheep-photos/README.md
137 ### Displaying images using custom template pages
138
139 Datasette's [custom pages](https://datasette.readthedocs.io/en/stable/custom_templates.html#custom-pages) feature lets you create custom pages for a Datasette instance by dropping HTML templates into a `templates/pages` directory and then running Datasette using `datasette --template-dir=templates/`.
140
141 You can combine that ability with the [datasette-template-sql](https://github.com/simonw/datasette-template-sql) plugin to create custom template pages that directly display photos served by `datasette-media`.
datasette-upload-csvs/README.md
16 The plugin adds an interface at `/-/upload-csvs` for uploading a CSV file and using it to create a new database table.
17
18 By default only [the root actor](https://datasette.readthedocs.io/en/stable/authentication.html#using-the-root-actor) can access the page - so you'll need to run Datasette with the `--root` option and click on the link shown in the terminal to sign in and access the page.
19
20 The `upload-csvs` permission governs access. You can use permission plugins such as [datasette-permissions-sql](https://github.com/simonw/datasette-permissions-sql) to grant additional access to the write interface.
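As a sketch of how that might look: based on my reading of the datasette-permissions-sql README, its configuration is a list of rules pairing an action with a SQL query that returns a row when access should be granted (the `admin_users` table here is a made-up example):

```json
{
    "plugins": {
        "datasette-permissions-sql": [
            {
                "action": "upload-csvs",
                "sql": "SELECT id FROM admin_users WHERE id = :actor_id"
            }
        ]
    }
}
```

Any actor whose `id` matches a row in `admin_users` would then be allowed to use the upload interface.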
datasette-sentry/README.md
29 }
30 ```
31 Settings in `metadata.json` are visible to anyone who visits the `/-/metadata` URL, so this is a good place to take advantage of Datasette's [secret configuration values](https://datasette.readthedocs.io/en/stable/plugins.html#secret-configuration-values), in which case your configuration will look more like this:
32 ```json
33 {
datasette-render-timestamps/README.md
19 ## Configuration
20
21 You can disable automatic column detection in favour of explicitly listing the columns that you would like to render using [plugin configuration](https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration) in a `metadata.json` file.
22
23 Add a `"datasette-render-timestamps"` configuration block and use a `"columns"` key to list the columns you would like to treat as timestamp values:
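A minimal sketch of that configuration block, assuming hypothetical column names `created_at` and `updated_at`:

```json
{
    "plugins": {
        "datasette-render-timestamps": {
            "columns": ["created_at", "updated_at"]
        }
    }
}
```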
datasette-render-markdown/README.md
16 ## Usage
17
18 You can explicitly list the columns you would like to treat as Markdown using [plugin configuration](https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration) in a `metadata.json` file.
19
20 Add a `"datasette-render-markdown"` configuration block and use a `"columns"` key to list the columns you would like to treat as Markdown values:
datasette-permissions-sql/README.md
15 ## Usage
16
17 First, read up on how Datasette's [authentication and permissions system](https://datasette.readthedocs.io/en/latest/authentication.html) works.
18
19 This plugin lets you define rules containing SQL queries that are executed to see if the currently authenticated actor has permission to perform certain actions.
78 `"database"` is also optional: it specifies the named database that the query should be executed against. If it is not present, the first connected database will be used.
79
80 The Datasette documentation includes a [list of built-in permissions](https://datasette.readthedocs.io/en/stable/authentication.html#built-in-permissions) that you might want to use here.
81
82 ### The SQL query
datasette-insert/README.md
153 This plugin defaults to denying all access, to help ensure people don't accidentally deploy it on the open internet in an unsafe configuration.
154
155 You can read about [Datasette's approach to authentication](https://datasette.readthedocs.io/en/stable/authentication.html) in the Datasette manual.
156
157 You can install the `datasette-insert-unsafe` plugin to run in unsafe mode, where all access is allowed by default.
159 I recommend using this plugin in conjunction with [datasette-auth-tokens](https://github.com/simonw/datasette-auth-tokens), which provides a mechanism for making authenticated calls using API tokens.
160
161 You can then use ["allow" blocks](https://datasette.readthedocs.io/en/stable/authentication.html#defining-permissions-with-allow-blocks) in the `datasette-insert` plugin configuration to specify which authenticated tokens are allowed to make use of the API.
162
163 Here's an example `metadata.json` file which restricts access to the `/-/insert` API to an API token defined in an `INSERT_TOKEN` environment variable:
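The example file itself is cut off in this excerpt. Based on my reading of the two plugins' READMEs, it would look something like the following - the `"bot": "insert-bot"` actor shape is illustrative, and the `$env` mechanism is Datasette's standard way of pulling a secret from an environment variable:

```json
{
    "plugins": {
        "datasette-insert": {
            "allow": {
                "bot": "insert-bot"
            }
        },
        "datasette-auth-tokens": {
            "tokens": [
                {
                    "token": {
                        "$env": "INSERT_TOKEN"
                    },
                    "actor": {
                        "bot": "insert-bot"
                    }
                }
            ]
        }
    }
}
```

Requests bearing the token from `INSERT_TOKEN` are authenticated as the `insert-bot` actor, which the allow block then permits to use the API.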
231 You can use plugins like [datasette-permissions-sql](https://github.com/simonw/datasette-permissions-sql) to hook into these more detailed permissions for finely grained control over what actions each authenticated actor can take.
232
233 Plugins that implement the [permission_allowed()](https://datasette.readthedocs.io/en/stable/plugin_hooks.html#plugin-hook-permission-allowed) plugin hook can take full control over these permission decisions.
234
235 ## CORS
datasette-ics/README.md
41 ## Using a canned query
42
43 Datasette's [canned query mechanism](https://datasette.readthedocs.io/en/stable/sql_queries.html#canned-queries) can be used to configure calendars. If a canned query definition has a `title`, it will be used as the title of the calendar.
44
45 Here's an example, defined using a `metadata.yaml` file:
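The example itself is truncated in this excerpt. A sketch of what such a `metadata.yaml` might look like - the `event_name` / `event_dtstart` column aliases are my recollection of the columns datasette-ics expects, and the database, query, and table names are made up:

```yaml
databases:
  mydb:
    queries:
      calendar:
        title: Upcoming events
        sql: |-
          select
            title as event_name,
            start as event_dtstart
          from events
```

Visiting the canned query with a `.ics` extension would then serve it as a subscribable calendar feed.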
datasette-import-table/README.md
18 Visit `/-/import-table` for the interface. Paste in the URL to a table page on another Datasette instance and click the button to import that table.
19
20 By default only [the root actor](https://datasette.readthedocs.io/en/stable/authentication.html#using-the-root-actor) can access the page - so you'll need to run Datasette with the `--root` option and click on the link shown in the terminal to sign in and access the page.
21
22 The `import-table` permission governs access. You can use permission plugins such as [datasette-permissions-sql](https://github.com/simonw/datasette-permissions-sql) to grant additional access to the write interface.
datasette-edit-schema/README.md
33 Use `/-/edit-schema/dbname` to create a new table in a specific database.
34
35 By default only [the root actor](https://datasette.readthedocs.io/en/stable/authentication.html#using-the-root-actor) can access the page - so you'll need to run Datasette with the `--root` option and click on the link shown in the terminal to sign in and access the page.
36
37 ## Permissions
datasette-csvs/requirements.txt
13
14 # More plugins here:
15 # https://datasette.readthedocs.io/en/stable/ecosystem.html#datasette-plugins
datasette-csvs/metadata.json
1 {
2 "title": "datasette-csvs",
3 "description_html": "<p>An example of how to run <a href=\"https://datasette.readthedocs.io/\">Datasette</a> on Glitch.</p><p>Hit the Remix button to get your own copy of this project, then drag-and-drop your CSV files into the Glitch editor to start interacting with your own data.<p><a href=\"https://glitch.com/edit/#!/remix/datasette-csvs\"><img src=\"https://cdn.glitch.com/2703baf2-b643-4da7-ab91-7ee2a2d00b5b%2Fremix-button.svg\" alt=\"Remix on Glitch\" /></a></p>"
4 }
datasette-configure-fts/README.md
20 Any time you have permission to configure FTS for a table, a menu item will appear in the table actions menu on the table page.
21
22 By default only [the root actor](https://datasette.readthedocs.io/en/stable/authentication.html#using-the-root-actor) can access the page - so you'll need to run Datasette with the `--root` option and click on the link shown in the terminal to sign in and access the page.
23
24 The `configure-fts` permission governs access. You can use permission plugins such as [datasette-permissions-sql](https://github.com/simonw/datasette-permissions-sql) to grant additional access to the write interface.
datasette-car-2019/README.md
5 ## Some introductory information
6
7 * [Datasette](https://datasette.readthedocs.io/)
8 * [The Datasette Ecosystem](https://datasette.readthedocs.io/en/stable/ecosystem.html)
9
10 Two of my projects:
43 https://nicar-2019.herokuapp.com/nicar-459dd1d/sessions
44
45 Datasette documentation is at https://datasette.readthedocs.io/
46
47 The `csvs-to-sqlite` documentation is at https://github.com/simonw/csvs-to-sqlite
112 All of this currently just lives on your local machine. How about publishing it to the internet?
113
114 Datasette is designed to be run on servers. You can set up your own server for this, but by far the easiest way to get your data live is to use [Heroku](https://www.heroku.com/). Datasette has a built-in command [for publishing using Heroku](https://datasette.readthedocs.io/en/stable/publish.html#publishing-to-heroku) which looks like this:
115
116 datasette publish heroku nicar.db --title "CAR 2019 schedule" --source_url=https://www.ire.org/events-and-training/conferences/nicar-2019
118 Before you can run this you'll need to be signed in with a Heroku account on your machine. You can do that by running `heroku login`.
119
120 This will publish your database and return a new URL where you can browse it online. The `--title` and `--source_url` options can be used to attach metadata to the new deployment, see [the docs](https://datasette.readthedocs.io/en/stable/publish.html#publishing-to-heroku) for additional options.
121
122 ## California Civic Data
datasette-auth-passwords/README.md
78 ### Specifying actors
79
80 By default, a logged in user will result in an [actor block](https://datasette.readthedocs.io/en/stable/authentication.html#actors) that just contains their username:
81
82 ```json
123 ### Using with datasette publish
124
125 If you are publishing data using a [datasette publish](https://datasette.readthedocs.io/en/stable/publish.html#datasette-publish) command you can use the `--plugin-secret` option to securely configure your password hashes (see [secret configuration values](https://datasette.readthedocs.io/en/stable/plugins.html#secret-configuration-values)).
126
127 You would run the command something like this:
datasette-auth-tokens/README.md
16 ## Hard-coded tokens
17
18 Read about Datasette's [authentication and permissions system](https://datasette.readthedocs.io/en/latest/authentication.html).
19
20 This plugin lets you configure secret API tokens which can be used to make authenticated requests to Datasette.
67 }
68 ```
69 This uses Datasette's [secret configuration values mechanism](https://datasette.readthedocs.io/en/stable/plugins.html#secret-configuration-values) to allow the secret token to be passed as an environment variable.
70
71 Run Datasette like this:
249 To avoid this, you should lock down access to that table. The configuration example above shows how to do this using an `"allow": false` block to deny all access to that `tokens` database.
250
251 Consult Datasette's [Permissions documentation](https://datasette.readthedocs.io/en/stable/authentication.html#permissions) for more information about how to lock down this kind of access.
datasette-auth-existing-cookies/README.md
24 ## Configuration
25
26 This plugin requires some configuration in the Datasette [metadata.json file](https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration).
27
28 The following configuration options are supported: