forked from Fediversity/Fediversity

Compare commits: 990777e7fd ... e562817597 (12 commits)

Commits: e562817597, 586c3b851a, c2db12a735, 57d53a1d22, 740b5447d8, 142af8d0ee, 3ec09b491d, 01de49d096, 06d3d37a39, 10f3d15a98, 92563d387a, fb64d2b9c9

59 changed files with 2189 additions and 1566 deletions
deployment/README.md: new file (219 lines)

@@ -0,0 +1,219 @@

# Provisioning VMs via Proxmox

## Quick links

Proxmox API doc
: <https://pve.proxmox.com/pve-docs/api-viewer>

Fediversity Proxmox
: <http://192.168.51.81:8006/>

## Basic terminology

Node
: physical host

## Fediversity Proxmox

- It is only accessible via Procolix's VPN:
- Get credentials for the VPN portal and Proxmox from
  [Kevin](https://git.fediversity.eu/kevin).
- Log in to the [VPN portal](https://vpn.fediversity.eu/vpn-user-portal/home).
- Create a **New Configuration**:
  - Select **WireGuard (UDP)**
  - Enter some name, e.g. `fediversity`
  - Click Download
- Write the WireGuard configuration to a file `fediversity-vpn.config`
  next to your NixOS configuration.
- Add that file's path to `.git/info/exclude` and make sure it doesn't
  otherwise leak (for example, use [Agenix](https://github.com/ryantm/agenix)
  to manage secrets).
- To your NixOS configuration, add

  ``` nix
  networking.wg-quick.interfaces.fediversity.configFile = toString ./fediversity-vpn.config;
  ```

- Select "Proxmox VE authentication server".
- Ignore the "You do not have a valid subscription" message.
## Automatically

This directory contains scripts that can automatically provision or
remove a Proxmox VM. For now, they are tied to one node in the
Fediversity Proxmox, but it would not be difficult to make them more
generic. Try:

```sh
bash proxmox/provision.sh --help
bash proxmox/remove.sh --help
```

## Preparing the machine configuration

- It is nicer if the machine is a QEMU guest. On NixOS:

  ``` nix
  services.qemuGuest.enable = true;
  ```

- Choose a name for your machine.

- Choose static IPs for your machine (the sketch after this list shows
  how these can be wired into a NixOS configuration). The IPv4 and IPv6
  subnets available for Fediversity testing are:

  - `95.215.187.0/24`. Gateway is `95.215.187.1`.
  - `2a00:51c0:13:1305::/64`. Gateway is `2a00:51c0:13:1305::1`.

- I have been using id `XXX` (starting from `001`), name `fediXXX`,
  `95.215.187.XXX` and `2a00:51c0:13:1305::XXX`.

- Name servers should be `95.215.185.6` and `95.215.185.7`.

- Check [Netbox](https://netbox.protagio.org) to see which addresses
  are free.
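To tie the addressing information above together, here is a minimal sketch of what the corresponding static network settings could look like in a NixOS configuration. The interface name `enp1s0` and the host number `003` are placeholders chosen for illustration (they are not taken from the repository); the subnets, gateways and name servers are the ones listed above.

``` nix
{
  networking = {
    hostName = "fedi003"; # hypothetical name following the fediXXX scheme above
    useDHCP = false;
    interfaces.enp1s0 = {
      # addresses picked from the Fediversity testing subnets listed above
      ipv4.addresses = [ { address = "95.215.187.3"; prefixLength = 24; } ];
      ipv6.addresses = [ { address = "2a00:51c0:13:1305::3"; prefixLength = 64; } ];
    };
    defaultGateway = "95.215.187.1";
    defaultGateway6 = { address = "2a00:51c0:13:1305::1"; interface = "enp1s0"; };
    nameservers = [ "95.215.185.6" "95.215.185.7" ];
  };
}
```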
## Manually via the GUI

### Upload your ISO

- Go to Fediversity proxmox.
- In the left view, expand under the node that you want and click on "local".
- Select "ISO Images", then click "Upload".
- Note: You can also download from URL.
- Note: You should click on "local" and not "local-zfs".

### Creating the VM

- Click "Create VM" at the top right corner.

#### General

Node
: which node will host the VM; has to be the same

VM ID
: Has to be unique, probably best to use the `xxxx` in `vm0xxxx`
  (yet to be decided)

Name
: Usually `vm` + 5 digits, e.g. `vm02199`

Resource pool
: Fediversity

#### OS

Use CD/DVD disc image file (iso)
:

Storage
: local, means storage of the node.

ISO image
: select the image previously uploaded

No need to touch anything else

#### System

BIOS
: OVMF (UEFI)

EFI Storage
: `linstor_storage`; this is a storage shared by all of the Proxmox machines.

Pre-Enroll keys
: MUST be unchecked

Qemu Agent
: check

#### Disks

- Tick "advanced" at the bottom.
- Disk size (GiB) :: 40 (depending on requirements)
- SSD emulation :: check (only visible if "Advanced" is checked)
- Discard :: check, so that blocks of removed data are cleared

#### CPU

Sockets
: 1 (depending on requirements)

Cores
: 2 (depending on requirements)

Enable NUMA
: check

#### Memory

Memory (MiB)
: choose what you want

Ballooning Device
: leave checked (only visible if "Advanced" is checked)

#### Network

Bridge
: `vnet1306`. This is the provisioning bridge; we will change it later.

Firewall
: uncheck, we will handle the firewall on the VM itself

#### Confirm

### Install and start the VM

- Start the VM a first time.
- Select the VM in the left panel. You might have to expand the node on which it is hosted.
- Select "Console" and start the VM.
- Install the VM as you would any other machine.
- [Shutdown the VM](#shutdown-the-vm).
- After the VM has been installed:
  - Select the VM again, then go to "Hardware".
  - Double click on the CD/DVD Drive line. Select "Do not use any media" and press OK.
  - Double click on Network Device, and change the bridge to `vnet1305`, the public bridge.
- Start the VM again.

### Remove the VM

- [Shutdown the VM](#shutdown-the-vm).
- On the top right corner, click "More", then "Remove".
- Enter the ID of the machine.
- Check "Purge from job configurations"
- Check "Destroy unreferenced disks owned by guest"
- Click "Remove".

### Move the VM to another node

- Make sure there is no ISO plugged in.
- Click on the VM. Click migrate. Choose target node. Go.
- Since the storage is shared, it should go pretty fast (~1 minute).

### Shutdown the VM

- Find the VM in the left panel.
- At the top right corner appears a "Shutdown" button with a submenu.
- Clicking "Shutdown" sends a signal to shut down the machine. This
  might not work if the machine is not listening for that signal.
- Brutal solution: in the submenu, select "Stop".
- The checkbox "Overrule active shutdown tasks" means that the machine
  should be stopped even if a shutdown is currently ongoing. This is
  particularly important if you have tried to shut the machine down
  normally just before.
@ -1,113 +0,0 @@
|
|||
#+title: Provisioning VMs via Proxmox
|
||||
|
||||
* Quick links
|
||||
- Proxmox API doc :: https://pve.proxmox.com/pve-docs/api-viewer
|
||||
- Fediversity Proxmox :: http://192.168.51.81:8006/
|
||||
* Basic terminology
|
||||
- Node :: physical host
|
||||
* Fediversity Proxmox
|
||||
- It is only accessible via Procolix's VPN:
|
||||
- Get credentials for the VPN portal and Proxmox from [[https://git.fediversity.eu/kevin][Kevin]].
|
||||
- Log in to the [[https://vpn.fediversity.eu/vpn-user-portal/home][VPN portal]].
|
||||
- Create a *New Configuration*:
|
||||
- Select *WireGuard (UDP)*
|
||||
- Enter some name, e.g. ~fediversity~
|
||||
- Click Download
|
||||
- Write the WireGuard configuration to a file ~fediversity-vpn.config~ next to your NixOS configuration
|
||||
- Add that file's path to ~.git/info/exclude~ and make sure it doesn't otherwise leak (for example, use [[https://github.com/ryantm/agenix][Agenix]] to manage secrets)
|
||||
- To your NixOS configuration, add
|
||||
#+begin_src nix
|
||||
networking.wg-quick.interfaces.fediversity.configFile = toString ./fediversity-vpn.config;
|
||||
#+end_src
|
||||
- Select “Promox VE authentication server”.
|
||||
- Ignore the “You do not have a valid subscription” message.
|
||||
* Automatically
|
||||
This directory contains scripts that can automatically provision or remove a
|
||||
Proxmox VM. For now, they are tied to one node in the Fediversity Proxmox, but
|
||||
it would not be difficult to make them more generic. Try:
|
||||
#+begin_src sh
|
||||
sh proxmox/provision.sh --help
|
||||
sh proxmox/remove.sh --help
|
||||
#+end_src
|
||||
* Preparing the machine configuration
|
||||
- It is nicer if the machine is a QEMU guest. On NixOS:
|
||||
#+begin_src nix
|
||||
services.qemuGuest.enable = true
|
||||
#+end_src
|
||||
- Choose name for your machine.
|
||||
- Choose static IPs for your machine. The IPv4 and IPv6 subnets available for
|
||||
Fediversity testing are:
|
||||
- ~95.215.187.0/24~. Gateway is ~95.215.187.1~.
|
||||
- ~2a00:51c0:13:1305::/64~. Gateway is ~2a00:51c0:13:1305::1~.
|
||||
- I have been using id ~XXX~ (starting from ~001~), name ~fediXXX~, ~95.215.187.XXX~ and
|
||||
~2a00:51c0:13:1305::XXX~.
|
||||
- Name servers should be ~95.215.185.6~ and ~95.215.185.7~.
|
||||
- Check [[https://netbox.protagio.org][Netbox]] to see which addresses are free.
|
||||
* Manually via the GUI
|
||||
** Upload your ISO
|
||||
- Go to Fediversity proxmox.
|
||||
- In the left view, expand under the node that you want and click on “local”.
|
||||
- Select “ISO Images”, then click “Upload”.
|
||||
- Note: You can also download from URL.
|
||||
- Note: You should click on “local” and not “local-zfs”.
|
||||
** Creating the VM
|
||||
- Click “Create VM” at the top right corner.
|
||||
*** General
|
||||
- Node :: which node will host the VM; has to be the same
|
||||
- VM ID :: Has to be unique, probably best to use the "xxxx" in "vm0xxxx" (yet to be decided)
|
||||
- Name :: Usually "vm" + 5 digits, e.g. "vm02199"
|
||||
- Resource pool :: Fediversity
|
||||
*** OS
|
||||
- Use CD/DVD disc image file (iso) ::
|
||||
- Storage :: local, means storage of the node.
|
||||
- ISO image :: select the image previously uploaded
|
||||
No need to touch anything else
|
||||
*** System
|
||||
- BIOS :: OVMF (UEFI)
|
||||
- EFI Storage :: ~linstor_storage~; this is a storage shared by all of the Proxmox machines.
|
||||
- Pre-Enroll keys :: MUST be unchecked
|
||||
- Qemu Agent :: check
|
||||
*** Disks
|
||||
- Tick “advanced” at the bottom.
|
||||
- Disk size (GiB) :: 40 (depending on requirements)
|
||||
- SSD emulation :: check (only visible if “Advanced” is checked)
|
||||
- Discard :: check, so that blocks of removed data are cleared
|
||||
*** CPU
|
||||
- Sockets :: 1 (depending on requirements)
|
||||
- Cores :: 2 (depending on requirements)
|
||||
- Enable NUMA :: check
|
||||
*** Memory
|
||||
- Memory (MiB) :: choose what you want
|
||||
- Ballooning Device :: leave checked (only visible if “Advanced” is checked)
|
||||
*** Network
|
||||
- Bridge :: ~vnet1306~. This is the provisioning bridge; we will change it later.
|
||||
- Firewall :: uncheck, we will handle the firewall on the VM itself
|
||||
*** Confirm
|
||||
** Install and start the VM
|
||||
- Start the VM a first time.
|
||||
- Select the VM in the left panel. You might have to expand the node on which it is hosted.
|
||||
- Select “Console” and start the VM.
|
||||
- Install the VM as you would any other machine.
|
||||
- [[Shutdown the VM]].
|
||||
- After the VM has been installed:
|
||||
- Select the VM again, then go to “Hardware”.
|
||||
- Double click on the CD/DVD Drive line. Select “Do not use any media” and press OK.
|
||||
- Double click on Network Device, and change the bridge to ~vnet1305~, the public bridge.
|
||||
- Start the VM again.
|
||||
** Remove the VM
|
||||
- [[Shutdown the VM]].
|
||||
- On the top right corner, click “More”, then “Remove”.
|
||||
- Enter the ID of the machine.
|
||||
- Check “Purge from job configurations”
|
||||
- Check “Destroy unreferenced disks owned by guest”
|
||||
- Click “Remove”.
|
||||
** Move the VM to another node
|
||||
- Make sure there is no ISO plugged in.
|
||||
- Click on the VM. Click migrate. Choose target node. Go.
|
||||
- Since the storage is shared, it should go pretty fast (~1 minute).
|
||||
** Shutdown the VM
|
||||
- Find the VM in the left panel.
|
||||
- At the top right corner appears a “Shutdown” button with a submenu.
|
||||
- Clicking “Shutdown” sends a signal to shutdown the machine. This might not work if the machine is not listening for that signal.
|
||||
- Brutal solution: in the submenu, select “Stop”.
|
||||
- The checkbox “Overrule active shutdown tasks” means that the machine should be stopped even if a shutdown is currently ongoing. This is particularly important if you have tried to shut the machine down normally just before.
|
flake.nix (diffstat: 21)
|
@ -41,32 +41,23 @@
|
|||
formatter = pkgs.nixfmt-rfc-style;
|
||||
|
||||
pre-commit.settings.hooks =
|
||||
## Not everybody might want pre-commit hooks, so we make them
|
||||
## opt-in. Maybe one day we will decide to have them everywhere.
|
||||
let
|
||||
inherit (builtins) concatStringsSep;
|
||||
optin = [
|
||||
"deployment"
|
||||
"infra"
|
||||
"keys"
|
||||
"secrets"
|
||||
"services"
|
||||
"panel"
|
||||
];
|
||||
files = "^((" + concatStringsSep "|" optin + ")/.*\\.nix|[^/]*\\.nix)$";
|
||||
## Add a directory here if pre-commit hooks shouldn't apply to it.
|
||||
optout = [ "npins" ];
|
||||
excludes = map (dir: "^${dir}/") optout;
|
||||
in
|
||||
{
|
||||
nixfmt-rfc-style = {
|
||||
enable = true;
|
||||
inherit files;
|
||||
inherit excludes;
|
||||
};
|
||||
deadnix = {
|
||||
enable = true;
|
||||
inherit files;
|
||||
inherit excludes;
|
||||
};
|
||||
trim-trailing-whitespace = {
|
||||
enable = true;
|
||||
inherit files;
|
||||
inherit excludes;
|
||||
};
|
||||
};
|
||||
|
||||
|
|
65
infra/README.md
Normal file
65
infra/README.md
Normal file
|
@ -0,0 +1,65 @@
|
|||
# Infra
|
||||
|
||||
This directory contains the definition of the VMs that host our infrastructure.
|
||||
|
||||
## NixOps4
|
||||
|
||||
Their configuration can be updated via NixOps4. Run
|
||||
|
||||
```sh
|
||||
nixops4 deployments list
|
||||
```
|
||||
|
||||
to see the available deployments.
|
||||
This should be done from the root of the repository,
|
||||
otherwise NixOps4 will fail with something like:
|
||||
|
||||
```
|
||||
nixops4 error: evaluation: error:
|
||||
… while calling the 'getFlake' builtin
|
||||
|
||||
error: path '/nix/store/05nn7krhvi8wkcyl6bsysznlv60g5rrf-source/flake.nix' does not exist, evaluation: error:
|
||||
… while calling the 'getFlake' builtin
|
||||
|
||||
error: path '/nix/store/05nn7krhvi8wkcyl6bsysznlv60g5rrf-source/flake.nix' does not exist
|
||||
```
|
||||
|
||||
Then, given a deployment (eg. `git`), run
|
||||
|
||||
```sh
|
||||
nixops4 apply <deployment>
|
||||
```
|
||||
|
||||
Alternatively, to run the `default` deployment, run
|
||||
|
||||
```sh
|
||||
nixops4 apply
|
||||
```
|
||||
|
||||
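For instance, `nixops4 apply git` would deploy only the machines in the `git` deployment listed below.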
## Deployments

`default`
: Contains everything

`git`
: Machines hosting our Git infrastructure, e.g. Forgejo and its actions runners

`web`
: Machines hosting our online content, e.g. the website or the wiki

`other`
: Machines without a specific purpose

## Machines

These machines are hosted on the Procolix Proxmox instance,
to which non-Procolix members of the project do not have access.
They host our stable infrastructure.

| Machine | Proxmox     | Description            | Deployment |
|---------|-------------|------------------------|------------|
| vm02116 | Procolix    | Forgejo                | `git`      |
| vm02179 | Procolix    | *unused*               | `other`    |
| vm02186 | Procolix    | *unused*               | `other`    |
| vm02187 | Procolix    | Wiki                   | `web`      |
| fedi300 | Fediversity | Forgejo actions runner | `git`      |
|
|
@ -1,58 +0,0 @@
|
|||
#+title: Infra
|
||||
|
||||
This directory contains the definition of the VMs that host our infrastructure.
|
||||
|
||||
* NixOps4
|
||||
|
||||
Their configuration can be updated via NixOps4. Run
|
||||
|
||||
#+begin_src sh
|
||||
nixops4 deployments list
|
||||
#+end_src
|
||||
|
||||
to see the available deployments. This should be done from the root of the
|
||||
repository, otherwise NixOps4 will fail with something like:
|
||||
|
||||
#+begin_src
|
||||
nixops4 error: evaluation: error:
|
||||
… while calling the 'getFlake' builtin
|
||||
|
||||
error: path '/nix/store/05nn7krhvi8wkcyl6bsysznlv60g5rrf-source/flake.nix' does not exist, evaluation: error:
|
||||
… while calling the 'getFlake' builtin
|
||||
|
||||
error: path '/nix/store/05nn7krhvi8wkcyl6bsysznlv60g5rrf-source/flake.nix' does not exist
|
||||
#+end_src
|
||||
|
||||
Then, given a deployment (eg. ~git~), run
|
||||
|
||||
#+begin_src sh
|
||||
nixops4 apply <deployment>
|
||||
#+end_src
|
||||
|
||||
Alternatively, to run the ~default~ deployment, run
|
||||
|
||||
#+begin_src sh
|
||||
nixops4 apply
|
||||
#+end_src
|
||||
|
||||
* Deployments
|
||||
|
||||
- default :: Contains everything
|
||||
- ~git~ :: Machines hosting our Git infrastructure, eg. Forgejo and its actions
|
||||
runners
|
||||
- ~web~ :: Machines hosting our online content, eg. the website or the wiki
|
||||
- ~other~ :: Machines without a specific purpose
|
||||
|
||||
* Machines
|
||||
|
||||
These machines are hosted on the Procolix Proxmox instance, to which
|
||||
non-Procolix members of the project do not have access. They host our stable
|
||||
infrastructure.
|
||||
|
||||
| Machine | Proxmox | Description | Deployment |
|
||||
|---------+-------------+------------------------+------------|
|
||||
| vm02116 | Procolix | Forgejo | ~git~ |
|
||||
| vm02179 | Procolix | /unused/ | ~other~ |
|
||||
| vm02186 | Procolix | /unused/ | ~other~ |
|
||||
| vm02187 | Procolix | Wiki | ~web~ |
|
||||
| fedi300 | Fediversity | Forgejo actions runner | ~git~ |
|
|
@ -26,7 +26,7 @@ lib.mapAttrs (name: test: pkgs.testers.runNixOSTest (test // { inherit name; }))
|
|||
# run all application-level tests managed by Django
|
||||
# https://docs.djangoproject.com/en/5.0/topics/testing/overview/
|
||||
testScript = ''
|
||||
server.succeed("manage test")
|
||||
server.succeed("manage test ${name}")
|
||||
'';
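Passing the suite's `${name}` as a label here makes Django run only the tests for that module rather than discovering the whole project, which matches the standard `manage.py test <label>` behaviour.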
|
||||
};
|
||||
admin = {
|
||||
|
|
panel/src/panel/static/favicon.ico: new binary file (22 KiB, not shown)
|
@ -30,7 +30,7 @@
|
|||
{% load custom_tags %}
|
||||
<li>
|
||||
{% if user.is_authenticated %}
|
||||
Welcome, {{ user.username }}! <a id="logout" href="{% auth_url 'logout' %}">Logout</a>
|
||||
Welcome, <a href="{% url 'account_detail' %}">{{ user.username }}</a>! <a id="logout" href="{% auth_url 'logout' %}">Logout</a>
|
||||
{% else %}
|
||||
<a id="login" href="{% auth_url 'login' %}">Login</a>
|
||||
{% endif %}
|
||||
|
|
panel/src/panel/tests/__init__.py: new empty file (0 lines)

panel/src/panel/tests/test_user_stories.py: new file (104 lines)
|
@ -0,0 +1,104 @@
|
|||
from django.test import TestCase
|
||||
from django.urls import reverse
|
||||
from django.contrib.auth.models import User
|
||||
from django.template import Template, Context
|
||||
from urllib.parse import unquote
|
||||
|
||||
class Login(TestCase):
|
||||
def setUp(self):
|
||||
self.username = 'testuser'
|
||||
self.password = 'securepassword123'
|
||||
self.user = User.objects.create_user(
|
||||
username=self.username,
|
||||
email='test@example.com',
|
||||
password=self.password
|
||||
)
|
||||
|
||||
self.login = reverse('login')
|
||||
self.logout = reverse('logout')
|
||||
self.required_login = reverse('account_detail')
|
||||
self.optional_login = reverse('service_list')
|
||||
|
||||
def test_optional_login_redirects_back_to_original_page(self):
|
||||
# go to a view where authentication is optional
|
||||
response = self.client.get(self.optional_login)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
self.assertFalse(response.context['user'].is_authenticated)
|
||||
|
||||
# check that the expected login URL is in the response
|
||||
context = response.context[0]
|
||||
template = Template("{% load custom_tags %}{% auth_url 'login' %}")
|
||||
login_url = template.render(context)
|
||||
self.assertIn(login_url, response.content.decode('utf-8'))
|
||||
|
||||
# log in
|
||||
response = self.client.get(login_url)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
|
||||
login_data = {
|
||||
'username': self.username,
|
||||
'password': self.password,
|
||||
}
|
||||
response = self.client.post(login_url, login_data, follow=True)
|
||||
|
||||
# check that we're back at the desired view and authenticated
|
||||
self.assertEqual(response.status_code, 200)
|
||||
self.assertTrue(response.context['user'].is_authenticated)
|
||||
location, status = response.redirect_chain[-1]
|
||||
self.assertEqual(location, self.optional_login)
|
||||
|
||||
# check that the expected logout URL is present
|
||||
context = response.context[0]
|
||||
template = Template("{% load custom_tags %}{% auth_url 'logout' %}")
|
||||
logout_url = template.render(context)
|
||||
self.assertIn(logout_url, response.content.decode('utf-8'))
|
||||
|
||||
# log out again
|
||||
response = self.client.get(logout_url, follow=True)
|
||||
|
||||
# check that we're back at the view and logged out
|
||||
self.assertEqual(response.status_code, 200)
|
||||
location, status = response.redirect_chain[-1]
|
||||
self.assertEqual(location, self.optional_login)
|
||||
self.assertFalse(response.context['user'].is_authenticated)
|
||||
|
||||
def test_required_login_redirects_back_login(self):
|
||||
# go to a view that requires authentication
|
||||
response = self.client.get(self.required_login)
|
||||
|
||||
# check that we're redirected to the login view
|
||||
self.assertEqual(response.status_code, 302)
|
||||
redirect = response.url
|
||||
self.assertTrue(redirect.startswith(self.login))
|
||||
|
||||
# log in
|
||||
response = self.client.get(redirect)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
login_data = {
|
||||
'username': self.username,
|
||||
'password': self.password,
|
||||
}
|
||||
response = self.client.post(redirect, login_data, follow=True)
|
||||
|
||||
# check that we reached the desired view, authenticated
|
||||
self.assertEqual(response.status_code, 200)
|
||||
location, status = response.redirect_chain[-1]
|
||||
self.assertEqual(location, self.required_login)
|
||||
self.assertTrue(response.context['user'].is_authenticated)
|
||||
|
||||
# check that the expected logout URL is present
|
||||
context = response.context[0]
|
||||
template = Template("{% load custom_tags %}{% auth_url 'logout' %}")
|
||||
logout_url = template.render(context)
|
||||
self.assertIn(logout_url, response.content.decode('utf-8'))
|
||||
|
||||
# log out
|
||||
response = self.client.get(logout_url, follow=True)
|
||||
|
||||
# check that we're at the expected location, logged out
|
||||
self.assertEqual(response.status_code, 200)
|
||||
template = Template("{% load custom_tags %}{% auth_url 'login' %}")
|
||||
login_url = template.render(context)
|
||||
location, status = response.redirect_chain[-1]
|
||||
self.assertEqual(location, unquote(login_url))
|
||||
self.assertFalse(response.context['user'].is_authenticated)
|
|
@ -9,7 +9,9 @@ in
|
|||
collections.news.type = cfg.content-types.article;
|
||||
collections.events.type = cfg.content-types.event;
|
||||
|
||||
pages.index = { config, link, ... }: {
|
||||
pages.index =
|
||||
{ config, link, ... }:
|
||||
{
|
||||
title = "Welcome to the Fediversity project";
|
||||
description = "Fediversity web site";
|
||||
summary = ''
|
||||
|
@ -20,7 +22,8 @@ in
|
|||
|
||||
[Learn more about Fediversity](${link pages.fediversity})
|
||||
'';
|
||||
outputs.html = (cfg.templates.html.page config).override (final: prev: {
|
||||
outputs.html = (cfg.templates.html.page config).override (
|
||||
_final: prev: {
|
||||
html = {
|
||||
head.title.text = "Fediversity";
|
||||
head.link.stylesheets = prev.html.head.link.stylesheets ++ [
|
||||
|
@ -28,7 +31,13 @@ in
|
|||
];
|
||||
body.content =
|
||||
let
|
||||
to-section = { heading, body, attrs ? { } }: {
|
||||
to-section =
|
||||
{
|
||||
heading,
|
||||
body,
|
||||
attrs ? { },
|
||||
}:
|
||||
{
|
||||
section = {
|
||||
heading.content = heading;
|
||||
inherit attrs;
|
||||
|
@ -47,11 +56,11 @@ in
|
|||
section = {
|
||||
attrs = { };
|
||||
heading.content = config.title;
|
||||
content = [
|
||||
content =
|
||||
[
|
||||
(cfg.templates.html.markdown { inherit (config) name body; })
|
||||
]
|
||||
++
|
||||
(map to-section [
|
||||
++ (map to-section [
|
||||
{
|
||||
heading = "Fediversity grants";
|
||||
body = ''
|
||||
|
@ -65,68 +74,95 @@ in
|
|||
body = ''
|
||||
The Consortium behind the Fediversity project is a cooperation between NLnet, Open Internet Discourse Foundation, NORDUnet and Tweag.
|
||||
|
||||
${toString (map (partner: ''
|
||||
${toString (
|
||||
map
|
||||
(partner: ''
|
||||
### ${partner.title}
|
||||
|
||||
${partner.summary}
|
||||
|
||||
[Read more about ${partner.title}](${link partner})
|
||||
'') (with pages; [ nlnet oid tweag nordunet ]))}
|
||||
'')
|
||||
(
|
||||
with pages;
|
||||
[
|
||||
nlnet
|
||||
oid
|
||||
tweag
|
||||
nordunet
|
||||
]
|
||||
)
|
||||
)}
|
||||
'';
|
||||
}
|
||||
{
|
||||
heading = "Fediverse explained";
|
||||
body = ''
|
||||
${toString (map (role: ''
|
||||
${toString (
|
||||
map
|
||||
(role: ''
|
||||
### ${role.title}
|
||||
|
||||
${role.summary}
|
||||
|
||||
[Read more about ${role.title}](${link role})
|
||||
'') (with pages; [ individuals developers european-commission ]))}
|
||||
'')
|
||||
(
|
||||
with pages;
|
||||
[
|
||||
individuals
|
||||
developers
|
||||
european-commission
|
||||
]
|
||||
)
|
||||
)}
|
||||
'';
|
||||
}
|
||||
]);
|
||||
};
|
||||
}
|
||||
]
|
||||
++
|
||||
(map to-section [
|
||||
++ (map to-section [
|
||||
{
|
||||
heading = "News";
|
||||
attrs = { class = [ "collection" ]; };
|
||||
attrs = {
|
||||
class = [ "collection" ];
|
||||
};
|
||||
body =
|
||||
let
|
||||
sorted = with lib; reverseList (sortOn (entry: entry.date) cfg.collections.news.entry);
|
||||
in
|
||||
lib.join "\n" (map
|
||||
(article: ''
|
||||
lib.join "\n" (
|
||||
map (article: ''
|
||||
- ${article.date} [${article.title}](${link article})
|
||||
'')
|
||||
sorted);
|
||||
'') sorted
|
||||
);
|
||||
}
|
||||
{
|
||||
heading = "Events";
|
||||
attrs = { class = [ "collection" ]; };
|
||||
attrs = {
|
||||
class = [ "collection" ];
|
||||
};
|
||||
body =
|
||||
let
|
||||
sorted = with lib; reverseList (sortOn (entry: entry.start-date) cfg.collections.events.entry);
|
||||
in
|
||||
lib.join "\n" (map
|
||||
(article: ''
|
||||
lib.join "\n" (
|
||||
map (article: ''
|
||||
- ${article.start-date} [${article.title}](${link article})
|
||||
'')
|
||||
sorted);
|
||||
'') sorted
|
||||
);
|
||||
}
|
||||
]);
|
||||
};
|
||||
|
||||
});
|
||||
}
|
||||
);
|
||||
};
|
||||
|
||||
assets."index.css".path = with lib; builtins.toFile
|
||||
"index.css"
|
||||
''
|
||||
assets."index.css".path =
|
||||
with lib;
|
||||
builtins.toFile "index.css" ''
|
||||
section h1, section h2, section h3
|
||||
{
|
||||
text-align: center;
|
||||
|
|
|
@ -1,19 +1,23 @@
|
|||
{ config, lib, ... }:
|
||||
{
|
||||
pages.events = { link, ... }: rec {
|
||||
pages.events =
|
||||
{ link, ... }:
|
||||
rec {
|
||||
title = "Events";
|
||||
description = "Events related to the Fediverse and NixOS";
|
||||
summary = description;
|
||||
body =
|
||||
with lib;
|
||||
let
|
||||
events = map
|
||||
(event: with lib; ''
|
||||
events = map (
|
||||
event: with lib; ''
|
||||
## [${event.title}](${link event})
|
||||
|
||||
${event.start-date} ${optionalString (!isNull event.end-date && event.end-date != event.start-date) "to ${event.end-date}"} in ${event.location}
|
||||
'')
|
||||
config.collections.events.entry;
|
||||
${event.start-date} ${
|
||||
optionalString (!isNull event.end-date && event.end-date != event.start-date) "to ${event.end-date}"
|
||||
} in ${event.location}
|
||||
''
|
||||
) config.collections.events.entry;
|
||||
in
|
||||
''
|
||||
${join "\n" events}
|
||||
|
|
|
@ -1,6 +1,8 @@
|
|||
{ config, lib, ... }:
|
||||
{ config, ... }:
|
||||
{
|
||||
collections.events.entry = { link, ... }: {
|
||||
collections.events.entry =
|
||||
{ link, ... }:
|
||||
{
|
||||
title = "NixOS 24.11 ZHF hackathon";
|
||||
name = "zhf-24-11";
|
||||
description = "NixOS 24.11 ZHF hackathon in Zürich";
|
||||
|
|
|
@ -1,6 +1,8 @@
|
|||
{ ... }:
|
||||
{
|
||||
collections.events.entry = { ... }: {
|
||||
collections.events.entry =
|
||||
{ ... }:
|
||||
{
|
||||
title = "OW2con 2024";
|
||||
description = "OW2con is the annual European open source conference in Paris";
|
||||
start-date = "2024-06-11";
|
||||
|
|
|
@ -1,6 +1,8 @@
|
|||
{ ... }:
|
||||
{
|
||||
collections.events.entry = { ... }: {
|
||||
collections.events.entry =
|
||||
{ ... }:
|
||||
{
|
||||
title = "PublicSpaces Conference 2024";
|
||||
description = "A conference by PublicSpaces, Taking Back the Internet.";
|
||||
start-date = "2024-06-06";
|
||||
|
|
|
@ -1,6 +1,8 @@
|
|||
{ ... }:
|
||||
{
|
||||
collections.events.entry = { ... }: {
|
||||
collections.events.entry =
|
||||
{ ... }:
|
||||
{
|
||||
title = "State of the Internet 2024";
|
||||
description = "The State of the Internet 2024 by Waag";
|
||||
start-date = "2024-05-16";
|
||||
|
|
|
@ -1,4 +1,4 @@
|
|||
{ config, lib, ... }:
|
||||
{ config, ... }:
|
||||
let
|
||||
inherit (config) pages;
|
||||
in
|
||||
|
@ -6,16 +6,33 @@ in
|
|||
menus.main = {
|
||||
label = "Main";
|
||||
items = [
|
||||
{ page = pages.index // { title = "Start"; }; }
|
||||
{
|
||||
page = pages.index // {
|
||||
title = "Start";
|
||||
};
|
||||
}
|
||||
{
|
||||
menu.label = "For you";
|
||||
menu.items = map (page: { inherit page; })
|
||||
(with pages; [ individuals developers european-commission ]);
|
||||
menu.items = map (page: { inherit page; }) (
|
||||
with pages;
|
||||
[
|
||||
individuals
|
||||
developers
|
||||
european-commission
|
||||
]
|
||||
);
|
||||
}
|
||||
{
|
||||
menu.label = "Consortium";
|
||||
menu.items = map (page: { inherit page; })
|
||||
(with pages; [ nlnet oid tweag nordunet ]);
|
||||
menu.items = map (page: { inherit page; }) (
|
||||
with pages;
|
||||
[
|
||||
nlnet
|
||||
oid
|
||||
tweag
|
||||
nordunet
|
||||
]
|
||||
);
|
||||
}
|
||||
{ page = pages.fediversity; }
|
||||
{ page = pages.grants; }
|
||||
|
|
|
@ -1,21 +1,21 @@
|
|||
{ config, lib, ... }:
|
||||
{
|
||||
pages.news = { link, ... }: rec {
|
||||
pages.news =
|
||||
{ link, ... }:
|
||||
rec {
|
||||
title = "News";
|
||||
description = "News about Fediversity";
|
||||
summary = description;
|
||||
body =
|
||||
with lib;
|
||||
let
|
||||
news = map
|
||||
(article: ''
|
||||
news = map (article: ''
|
||||
## [${article.title}](${link article})
|
||||
|
||||
${article.date} by ${article.author}
|
||||
|
||||
${article.summary}
|
||||
'')
|
||||
config.collections.news.entry;
|
||||
'') config.collections.news.entry;
|
||||
in
|
||||
''
|
||||
${join "\n\n" news}
|
||||
|
|
|
@ -1,6 +1,8 @@
|
|||
{ config, lib, ... }:
|
||||
{ config, ... }:
|
||||
{
|
||||
collections.news.entry = { link, ... }: rec {
|
||||
collections.news.entry =
|
||||
{ link, ... }:
|
||||
rec {
|
||||
name = "zhf-24-11";
|
||||
title = "NixOS 24.11 release hackathon and workshop";
|
||||
description = "Fediversity engineers met in Zürich at a NixOS 24.11 ZHF hackathon";
|
||||
|
|
|
@ -1,6 +1,9 @@
|
|||
{ config, lib, ... }:
|
||||
{ ... }:
|
||||
|
||||
{
|
||||
collections.news.entry = { link, ... }: {
|
||||
collections.news.entry =
|
||||
{ ... }:
|
||||
{
|
||||
title = "Fediversity project publicly announced";
|
||||
description = "The Fediversity project has officially been announced";
|
||||
date = "2024-01-01";
|
||||
|
|
|
@ -1,14 +1,16 @@
|
|||
{ sources ? import ../npins
|
||||
, system ? builtins.currentSystem
|
||||
, pkgs ? import sources.nixpkgs {
|
||||
{
|
||||
sources ? import ../npins,
|
||||
system ? builtins.currentSystem,
|
||||
pkgs ? import sources.nixpkgs {
|
||||
inherit system;
|
||||
config = { };
|
||||
overlays = [ ];
|
||||
}
|
||||
, lib ? import "${sources.nixpkgs}/lib"
|
||||
},
|
||||
lib ? import "${sources.nixpkgs}/lib",
|
||||
}:
|
||||
let
|
||||
lib' = final: prev:
|
||||
lib' =
|
||||
final: prev:
|
||||
let
|
||||
new = import ./lib.nix { lib = final; };
|
||||
in
|
||||
|
@ -37,13 +39,19 @@ rec {
|
|||
let
|
||||
run-tests = pkgs.writeShellApplication {
|
||||
name = "run-tests";
|
||||
text = with pkgs; with lib; ''
|
||||
text =
|
||||
with pkgs;
|
||||
with lib;
|
||||
''
|
||||
${getExe nix-unit} ${toString ./tests.nix} "$@"
|
||||
'';
|
||||
};
|
||||
test-loop = pkgs.writeShellApplication {
|
||||
name = "test-loop";
|
||||
text = with pkgs; with lib; ''
|
||||
text =
|
||||
with pkgs;
|
||||
with lib;
|
||||
''
|
||||
${getExe watchexec} -w ${toString ./.} -- ${getExe nix-unit} ${toString ./tests.nix}
|
||||
'';
|
||||
};
|
||||
|
@ -62,7 +70,9 @@ rec {
|
|||
};
|
||||
|
||||
inherit sources pkgs;
|
||||
tests = with pkgs; with lib;
|
||||
tests =
|
||||
with pkgs;
|
||||
with lib;
|
||||
let
|
||||
source = fileset.toSource {
|
||||
root = ../.;
|
||||
|
|
website/lib.nix (diffstat: 167)
|
@ -1,22 +1,26 @@
|
|||
{ lib }:
|
||||
rec {
|
||||
template = g: f: x:
|
||||
template =
|
||||
g: f: x:
|
||||
let
|
||||
base = f x;
|
||||
result = g base;
|
||||
in
|
||||
result // {
|
||||
override = new:
|
||||
result
|
||||
// {
|
||||
override =
|
||||
new:
|
||||
let
|
||||
base' =
|
||||
if lib.isFunction new
|
||||
then lib.recursiveUpdate base (new base' base)
|
||||
if lib.isFunction new then
|
||||
lib.recursiveUpdate base (new base' base)
|
||||
else
|
||||
lib.recursiveUpdate base new;
|
||||
result' = g base';
|
||||
in
|
||||
result' // {
|
||||
override = new: (template g (x': base') x).override new;
|
||||
result'
|
||||
// {
|
||||
override = new: (template g (_: base') x).override new;
|
||||
};
|
||||
};
|
||||
|
||||
|
@ -28,7 +32,8 @@ rec {
|
|||
replaceStringRec "--" "-" "hello-----world"
|
||||
=> "hello-world"
|
||||
*/
|
||||
replaceStringsRec = from: to: string:
|
||||
replaceStringsRec =
|
||||
from: to: string:
|
||||
let
|
||||
replaced = lib.replaceStrings [ from ] [ to ] string;
|
||||
in
|
||||
|
@ -37,25 +42,24 @@ rec {
|
|||
/**
|
||||
Create a URL-safe slug from any string
|
||||
*/
|
||||
slug = str:
|
||||
slug =
|
||||
str:
|
||||
let
|
||||
# Replace non-alphanumeric characters with hyphens
|
||||
replaced = join ""
|
||||
(
|
||||
builtins.map
|
||||
(c:
|
||||
if (c >= "a" && c <= "z") || (c >= "0" && c <= "9")
|
||||
then c
|
||||
else "-"
|
||||
replaced = join "" (
|
||||
builtins.map (c: if (c >= "a" && c <= "z") || (c >= "0" && c <= "9") then c else "-") (
|
||||
with lib; stringToCharacters (toLower str)
|
||||
)
|
||||
(with lib; stringToCharacters (toLower str)));
|
||||
);
|
||||
|
||||
# Remove leading and trailing hyphens
|
||||
trimHyphens = s:
|
||||
trimHyphens =
|
||||
s:
|
||||
let
|
||||
matched = builtins.match "(-*)([^-].*[^-]|[^-])(-*)" s;
|
||||
in
|
||||
with lib; optionalString (!isNull matched) (builtins.elemAt matched 1);
|
||||
with lib;
|
||||
optionalString (!isNull matched) (builtins.elemAt matched 1);
|
||||
in
|
||||
trimHyphens (replaceStringsRec "--" "-" replaced);
|
||||
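Following the style of the `replaceStringsRec` example above, here is what `slug` should produce for a sample input (worked out by hand from the definition, not from an actual run):

```nix
slug "Hello, World!"
=> "hello-world"
```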
|
||||
|
@ -64,9 +68,11 @@ rec {
|
|||
/**
|
||||
Trim trailing spaces and squash non-leading spaces
|
||||
*/
|
||||
trim = string:
|
||||
trim =
|
||||
string:
|
||||
let
|
||||
trimLine = line:
|
||||
trimLine =
|
||||
line:
|
||||
with lib;
|
||||
let
|
||||
# separate leading spaces from the rest
|
||||
|
@ -76,8 +82,7 @@ rec {
|
|||
# drop trailing spaces
|
||||
body = head (split " *$" rest);
|
||||
in
|
||||
if body == "" then "" else
|
||||
spaces + replaceStringsRec " " " " body;
|
||||
if body == "" then "" else spaces + replaceStringsRec " " " " body;
|
||||
in
|
||||
join "\n" (map trimLine (splitLines string));
|
||||
|
||||
|
@ -85,35 +90,42 @@ rec {
|
|||
|
||||
splitLines = s: with builtins; filter (x: !isList x) (split "\n" s);
|
||||
|
||||
indent = prefix: s:
|
||||
indent =
|
||||
prefix: s:
|
||||
with lib.lists;
|
||||
let
|
||||
lines = splitLines s;
|
||||
in
|
||||
join "\n" (
|
||||
[ (head lines) ]
|
||||
++
|
||||
(map (x: if x == "" then x else "${prefix}${x}") (tail lines))
|
||||
);
|
||||
join "\n" ([ (head lines) ] ++ (map (x: if x == "" then x else "${prefix}${x}") (tail lines)));
|
||||
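Similarly, a quick sanity check of `indent` as reformatted above: the first line is left untouched and only the continuation lines get the prefix (again worked out by hand from the definition):

```nix
indent "  " "first\nsecond\nthird"
=> "first\n  second\n  third"
```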
|
||||
relativePath = path1': path2':
|
||||
relativePath =
|
||||
path1': path2':
|
||||
let
|
||||
inherit (lib.path) subpath;
|
||||
inherit (lib) lists length take drop min max;
|
||||
inherit (lib)
|
||||
lists
|
||||
length
|
||||
take
|
||||
drop
|
||||
min
|
||||
max
|
||||
;
|
||||
|
||||
path1 = subpath.components path1';
|
||||
prefix1 = take (length path1 - 1) path1;
|
||||
path2 = subpath.components path2';
|
||||
prefix2 = take (length path2 - 1) path2;
|
||||
|
||||
commonPrefixLength = with lists;
|
||||
findFirstIndex (i: i.fst != i.snd)
|
||||
(min (length prefix1) (length prefix2))
|
||||
(zipLists prefix1 prefix2);
|
||||
commonPrefixLength =
|
||||
with lists;
|
||||
findFirstIndex (i: i.fst != i.snd) (min (length prefix1) (length prefix2)) (
|
||||
zipLists prefix1 prefix2
|
||||
);
|
||||
|
||||
depth = max 0 (length prefix1 - commonPrefixLength);
|
||||
|
||||
relativeComponents = with lists;
|
||||
relativeComponents =
|
||||
with lists;
|
||||
[ "." ] ++ (replicate depth "..") ++ (drop commonPrefixLength path2);
|
||||
in
|
||||
join "/" relativeComponents;
|
||||
|
@ -122,47 +134,51 @@ rec {
|
|||
Recursively list all Nix files from a directory, except the top-level `default.nix`
|
||||
|
||||
Useful for module system `imports` from a top-level module.
|
||||
**/
|
||||
nixFiles = dir: with lib.fileset;
|
||||
toList (difference
|
||||
(fileFilter ({ hasExt, ... }: hasExt "nix") dir)
|
||||
(dir + "/default.nix")
|
||||
);
|
||||
*
|
||||
*/
|
||||
nixFiles =
|
||||
dir:
|
||||
with lib.fileset;
|
||||
toList (difference (fileFilter ({ hasExt, ... }: hasExt "nix") dir) (dir + "/default.nix"));
|
||||
|
||||
types = rec {
|
||||
# arbitrarily nested attribute set where the leaves are of type `type`
|
||||
# NOTE: this works for anything but attribute sets!
|
||||
recursiveAttrs = type: with lib.types;
|
||||
recursiveAttrs =
|
||||
type:
|
||||
with lib.types;
|
||||
# NOTE: due to how `either` works, the first match is significant,
|
||||
# so if `type` happens to be an attrset, the typecheck will consider
|
||||
# `type`, not `attrsOf`
|
||||
attrsOf (either type (recursiveAttrs type));
|
||||
|
||||
# collection of unnamed items that can be added to item-wise, i.e. without wrapping the item in a list
|
||||
collection = elemType:
|
||||
collection =
|
||||
elemType:
|
||||
let
|
||||
unparenthesize = class: class == "noun";
|
||||
desc = type:
|
||||
types.optionDescriptionPhrase unparenthesize type;
|
||||
desc' = type:
|
||||
desc = type: types.optionDescriptionPhrase unparenthesize type;
|
||||
desc' =
|
||||
type:
|
||||
let
|
||||
typeDesc = lib.types.optionDescriptionPhrase unparenthesize type;
|
||||
in
|
||||
if type.descriptionClass == "noun"
|
||||
then
|
||||
typeDesc + "s"
|
||||
else
|
||||
"many instances of ${typeDesc}";
|
||||
if type.descriptionClass == "noun" then typeDesc + "s" else "many instances of ${typeDesc}";
|
||||
in
|
||||
lib.types.mkOptionType {
|
||||
name = "collection";
|
||||
description = "separately specified ${desc elemType} for a collection of ${desc' elemType}";
|
||||
merge = loc: defs:
|
||||
map
|
||||
(def:
|
||||
elemType.merge (loc ++ [ "[definition ${toString def.file}]" ]) [{ inherit (def) file; value = def.value; }]
|
||||
)
|
||||
defs;
|
||||
merge =
|
||||
loc: defs:
|
||||
map (
|
||||
def:
|
||||
elemType.merge (loc ++ [ "[definition ${toString def.file}]" ]) [
|
||||
{
|
||||
inherit (def) file;
|
||||
value = def.value;
|
||||
}
|
||||
]
|
||||
) defs;
|
||||
check = elemType.check;
|
||||
getSubOptions = elemType.getSubOptions;
|
||||
getSubModules = elemType.getSubModules;
|
||||
|
@ -175,29 +191,34 @@ rec {
|
|||
nestedTypes.elemType = elemType;
|
||||
};
|
||||
|
||||
listOfUnique = elemType:
|
||||
listOfUnique =
|
||||
elemType:
|
||||
let
|
||||
baseType = lib.types.listOf elemType;
|
||||
in
|
||||
baseType // {
|
||||
merge = loc: defs:
|
||||
baseType
|
||||
// {
|
||||
merge =
|
||||
loc: defs:
|
||||
let
|
||||
# Keep track of which definition each value came from
|
||||
defsWithValues = map
|
||||
(def:
|
||||
map (v: { inherit (def) file; value = v; }) def.value
|
||||
)
|
||||
defs;
|
||||
defsWithValues = map (
|
||||
def:
|
||||
map (v: {
|
||||
inherit (def) file;
|
||||
value = v;
|
||||
}) def.value
|
||||
) defs;
|
||||
flatDefs = lib.flatten defsWithValues;
|
||||
|
||||
# Check for duplicates while preserving source info
|
||||
seen = builtins.foldl'
|
||||
(acc: def:
|
||||
if lib.lists.any (v: v.value == def.value) acc
|
||||
then throw "The option `${lib.options.showOption loc}` has duplicate values (${toString def.value}) defined in ${def.file}"
|
||||
else acc ++ [ def ]
|
||||
) [ ]
|
||||
flatDefs;
|
||||
seen = builtins.foldl' (
|
||||
acc: def:
|
||||
if lib.lists.any (v: v.value == def.value) acc then
|
||||
throw "The option `${lib.options.showOption loc}` has duplicate values (${toString def.value}) defined in ${def.file}"
|
||||
else
|
||||
acc ++ [ def ]
|
||||
) [ ] flatDefs;
|
||||
in
|
||||
map (def: def.value) seen;
|
||||
};
|
||||
|
|
|
@ -1,4 +1,10 @@
|
|||
{ config, options, lib, pkgs, ... }:
|
||||
{
|
||||
config,
|
||||
options,
|
||||
lib,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
let
|
||||
inherit (lib)
|
||||
mkOption
|
||||
|
@ -8,8 +14,7 @@ in
|
|||
{
|
||||
imports = lib.nixFiles ./.;
|
||||
|
||||
options.templates =
|
||||
mkOption {
|
||||
options.templates = mkOption {
|
||||
description = ''
|
||||
Collection of named helper functions for conversion different structured representations which can be rendered to a string
|
||||
'';
|
||||
|
@ -32,32 +37,35 @@ in
|
|||
type = types.package;
|
||||
default =
|
||||
let
|
||||
script = ''
|
||||
script =
|
||||
''
|
||||
mkdir $out
|
||||
'' + lib.join "\n" copy;
|
||||
copy = lib.mapAttrsToList
|
||||
(
|
||||
path: file: ''
|
||||
''
|
||||
+ lib.join "\n" copy;
|
||||
copy = lib.mapAttrsToList (path: file: ''
|
||||
mkdir -p $out/$(dirname ${path})
|
||||
cp -r ${file} $out/${path}
|
||||
''
|
||||
)
|
||||
config.files;
|
||||
'') config.files;
|
||||
in
|
||||
pkgs.runCommand "source" { } script;
|
||||
};
|
||||
|
||||
# TODO: this is an artefact of exploration; needs to be adapted to actual use
|
||||
config.templates.table-of-contents = { config, ... }:
|
||||
config.templates.table-of-contents =
|
||||
{ config, ... }:
|
||||
let
|
||||
outline = { ... }: {
|
||||
outline =
|
||||
{ ... }:
|
||||
{
|
||||
options = {
|
||||
value = mkOption {
|
||||
# null denotes root
|
||||
type = with types; nullOr (either str (listOf (attrTag categories.phrasing)));
|
||||
subsections = mkOption {
|
||||
type = with types; listOf (submodule outline);
|
||||
default = with lib; map
|
||||
default =
|
||||
with lib;
|
||||
map
|
||||
# TODO: go into depth manually here,
|
||||
# we don't want to pollute the DOM implementation
|
||||
(c: (lib.head (attrValues c)).outline)
|
||||
|
@ -67,10 +75,11 @@ in
|
|||
__toString = mkOption {
|
||||
type = with types; functionTo str;
|
||||
# TODO: convert to HTML
|
||||
default = self: lib.squash ''
|
||||
default =
|
||||
self:
|
||||
lib.squash ''
|
||||
${if isNull self.value then "root" else self.value}
|
||||
${if self.subsections != [] then
|
||||
" " + lib.indent " " (lib.join "\n" self.subsections) else ""}
|
||||
${if self.subsections != [ ] then " " + lib.indent " " (lib.join "\n" self.subsections) else ""}
|
||||
'';
|
||||
};
|
||||
};
|
||||
|
@ -81,9 +90,11 @@ in
|
|||
type = types.submodule outline;
|
||||
default = {
|
||||
value = null;
|
||||
subsections = with lib;
|
||||
map (c: (lib.head (attrValues c)).outline)
|
||||
(filter (c: isAttrs c && (lib.head (attrValues c)) ? outline) config.content);
|
||||
subsections =
|
||||
with lib;
|
||||
map (c: (lib.head (attrValues c)).outline) (
|
||||
filter (c: isAttrs c && (lib.head (attrValues c)) ? outline) config.content
|
||||
);
|
||||
};
|
||||
};
|
||||
};
|
||||
|
|
|
@ -6,8 +6,8 @@
|
|||
Similar work from the OCaml ecosystem: https://github.com/ocsigen/tyxml
|
||||
*/
|
||||
{ config, lib, ... }:
|
||||
|
||||
let
|
||||
cfg = config;
|
||||
inherit (lib) mkOption types;
|
||||
inherit (types) submodule;
|
||||
|
||||
|
@ -28,7 +28,9 @@ let
|
|||
];
|
||||
|
||||
# base type for all DOM elements
|
||||
element = { ... }: {
|
||||
element =
|
||||
{ ... }:
|
||||
{
|
||||
# TODO: add fields for upstream documentation references
|
||||
# TODO: programmatically generate documentation
|
||||
options = with lib; {
|
||||
|
@ -43,23 +45,31 @@ let
|
|||
};
|
||||
|
||||
# options with types for all the defined DOM elements
|
||||
element-types = lib.mapAttrs
|
||||
(name: value: mkOption { type = submodule value; })
|
||||
elements;
|
||||
element-types = lib.mapAttrs (_name: value: mkOption { type = submodule value; }) elements;
|
||||
|
||||
# attrset of categories, where values are module options with the type of the
|
||||
# elements that belong to these categories
|
||||
categories = with lib;
|
||||
genAttrs
|
||||
content-categories
|
||||
(category:
|
||||
categories =
|
||||
with lib;
|
||||
genAttrs content-categories (
|
||||
category:
|
||||
(mapAttrs (_: e: mkOption { type = submodule e; })
|
||||
# HACK: don't evaluate the submodule types, just grab the config directly
|
||||
# TODO: we may want to do this properly and loop `categories` through the top-level `config`
|
||||
(filterAttrs (_: e: elem category (e { name = "dummy"; config = { }; }).config.categories) elements))
|
||||
(
|
||||
filterAttrs (
|
||||
_: e:
|
||||
elem category
|
||||
(e {
|
||||
name = "dummy";
|
||||
config = { };
|
||||
}).config.categories
|
||||
) elements
|
||||
)
|
||||
)
|
||||
);
|
||||
|
||||
global-attrs = lib.mapAttrs (name: value: mkOption value) {
|
||||
global-attrs = lib.mapAttrs (_name: value: mkOption value) {
|
||||
class = {
|
||||
type = with types; listOf nonEmptyStr;
|
||||
default = [ ];
|
||||
|
@ -95,7 +105,7 @@ let
|
|||
|
||||
# all possible attributes to `<link>` elements.
|
||||
# since not all of them apply to each `rel=` type, the separate implementations can pick from this collection
|
||||
link-attrs = lib.mapAttrs (name: value: mkOption value) {
|
||||
link-attrs = lib.mapAttrs (_name: value: mkOption value) {
|
||||
href = {
|
||||
# TODO: implement https://html.spec.whatwg.org/multipage/semantics.html#the-link-element:attr-link-href-3
|
||||
# TODO: https://url.spec.whatwg.org/#valid-url-string
|
||||
|
@ -120,7 +130,7 @@ let
|
|||
|
||||
# TODO: not sure where to put these, since so far they apply to multiple elements,
|
||||
# but have the same properties for all of them
|
||||
attrs = lib.mapAttrs (name: value: mkOption value) {
|
||||
attrs = lib.mapAttrs (_name: value: mkOption value) {
|
||||
# TODO: investigate: `href` may be coupled with other attributes such as `target` or `hreflang`, this could simplify things
|
||||
href = {
|
||||
# TODO: https://url.spec.whatwg.org/#valid-url-string
|
||||
|
@ -131,7 +141,8 @@ let
|
|||
# https://html.spec.whatwg.org/multipage/document-sequences.html#valid-navigable-target-name-or-keyword
|
||||
type =
|
||||
let
|
||||
is-valid-target = s:
|
||||
is-valid-target =
|
||||
s:
|
||||
let
|
||||
inherit (lib) match;
|
||||
has-lt = s: match ".*<.*" s != null;
|
||||
|
@ -140,14 +151,19 @@ let
|
|||
in
|
||||
has-valid-start s && !(has-lt s && has-tab-or-newline s);
|
||||
in
|
||||
with types; either
|
||||
(enum [ "_blank" "_self" "_parent" "_top" ])
|
||||
(types.addCheck str is-valid-target)
|
||||
;
|
||||
with types;
|
||||
either (enum [
|
||||
"_blank"
|
||||
"_self"
|
||||
"_parent"
|
||||
"_top"
|
||||
]) (types.addCheck str is-valid-target);
|
||||
};
|
||||
};
|
||||
|
||||
mkAttrs = attrs: with lib;
|
||||
mkAttrs =
|
||||
attrs:
|
||||
with lib;
|
||||
mkOption {
|
||||
type = submodule {
|
||||
options = global-attrs // attrs;
|
||||
|
@ -155,28 +171,33 @@ let
|
|||
default = { };
|
||||
};
|
||||
|
||||
print-attrs = with lib; attrs:
|
||||
print-attrs =
|
||||
with lib;
|
||||
attrs:
|
||||
# TODO: figure out how let attributes know how to print themselves without polluting the interface
|
||||
let
|
||||
result = trim (join " "
|
||||
(mapAttrsToList
|
||||
result = trim (
|
||||
join " " (
|
||||
mapAttrsToList
|
||||
# TODO: this needs to be smarter for boolean attributes
|
||||
# where the value must be written out explicitly.
|
||||
# probably the attribute itself should have its own `__toString`.
|
||||
(name: value:
|
||||
(
|
||||
name: value:
|
||||
if isBool value then
|
||||
if value then name else ""
|
||||
# TODO: some attributes must be explicitly empty
|
||||
else optionalString (toString value != "") ''${name}="${toString value}"''
|
||||
else
|
||||
optionalString (toString value != "") ''${name}="${toString value}"''
|
||||
)
|
||||
attrs
|
||||
)
|
||||
attrs)
|
||||
);
|
||||
in
|
||||
if attrs == null then throw "wat" else
|
||||
optionalString (stringLength result > 0) " " + result
|
||||
;
|
||||
if attrs == null then throw "wat" else optionalString (stringLength result > 0) " " + result;
|
||||
|
||||
print-element = name: attrs: content:
|
||||
print-element =
|
||||
name: attrs: content:
|
||||
with lib;
|
||||
# TODO: be smarter about content to save some space and repetition at the call sites
|
||||
squash (trim ''
|
||||
|
@ -187,16 +208,20 @@ let
|
|||
|
||||
print-element' = name: attrs: "<${name}${print-attrs attrs}>";
|
||||
|
||||
toString-unwrap = e:
|
||||
toString-unwrap =
|
||||
e:
|
||||
with lib;
|
||||
if isAttrs e
|
||||
then toString (head (attrValues e))
|
||||
else if isList e
|
||||
then toString (map toString-unwrap e)
|
||||
else e;
|
||||
if isAttrs e then
|
||||
toString (head (attrValues e))
|
||||
else if isList e then
|
||||
toString (map toString-unwrap e)
|
||||
else
|
||||
e;
|
||||
|
||||
elements = rec {
|
||||
document = { ... }: {
|
||||
document =
|
||||
{ ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
inherit (element-types) html;
|
||||
|
@ -210,7 +235,9 @@ let
|
|||
'';
|
||||
};
|
||||
|
||||
html = { name, ... }: {
|
||||
html =
|
||||
{ name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
|
@ -218,13 +245,17 @@ let
|
|||
};
|
||||
|
||||
config.categories = [ ];
|
||||
config.__toString = self: print-element name self.attrs ''
|
||||
config.__toString =
|
||||
self:
|
||||
print-element name self.attrs ''
|
||||
${self.head}
|
||||
${self.body}
|
||||
'';
|
||||
};
|
||||
|
||||
head = { name, ... }: {
|
||||
head =
|
||||
{ name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = with lib; {
|
||||
attrs = mkAttrs { };
|
||||
|
@ -248,19 +279,17 @@ let
|
|||
# https://developer.mozilla.org/en-US/docs/Web/HTML/Viewport_meta_tag#viewport_width_and_screen_width
|
||||
# this should not exist and no one should ever have to think about it
|
||||
meta.viewport = mkOption {
|
||||
type = submodule ({ ... }: {
|
||||
type = submodule (
|
||||
{ ... }:
|
||||
{
|
||||
# TODO: figure out how to render only non-default values
|
||||
options = {
|
||||
width = mkOption {
|
||||
type = with types; either
|
||||
(ints.between 1 10000)
|
||||
(enum [ "device-width" ]);
|
||||
type = with types; either (ints.between 1 10000) (enum [ "device-width" ]);
|
||||
default = "device-width"; # not default by standard
|
||||
};
|
||||
height = mkOption {
|
||||
type = with types; either
|
||||
(ints.between 1 10000)
|
||||
(enum [ "device-height" ]);
|
||||
type = with types; either (ints.between 1 10000) (enum [ "device-height" ]);
|
||||
default = "device-height"; # not default by standard (but seems to work if you don't set it)
|
||||
};
|
||||
initial-scale = mkOption {
|
||||
|
@ -289,7 +318,8 @@ let
|
|||
default = "resizes-visual";
|
||||
};
|
||||
};
|
||||
});
|
||||
}
|
||||
);
|
||||
default = { };
|
||||
};
|
||||
|
||||
|
@ -318,15 +348,18 @@ let
|
|||
};
|
||||
|
||||
config.categories = [ ];
|
||||
config.__toString = self:
|
||||
config.__toString =
|
||||
self:
|
||||
with lib;
|
||||
print-element name self.attrs ''
|
||||
${self.title}
|
||||
${with lib; optionalString (!isNull self.base) self.base}
|
||||
<meta charset="${self.meta.charset}" />
|
||||
|
||||
${/* https://html.spec.whatwg.org/multipage/semantics.html#attr-meta-http-equiv-x-ua-compatible */
|
||||
""}<meta http-equiv="X-UA-Compatible" content="IE=edge" />
|
||||
${
|
||||
# https://html.spec.whatwg.org/multipage/semantics.html#attr-meta-http-equiv-x-ua-compatible
|
||||
""
|
||||
}<meta http-equiv="X-UA-Compatible" content="IE=edge" />
|
||||
<!--
|
||||
TODO: make proper icon and preload types
|
||||
-->
|
||||
|
@ -336,25 +369,41 @@ let
|
|||
|
||||
${print-element' "meta" {
|
||||
name = "viewport";
|
||||
content = "${join ", " (mapAttrsToList (name: value: "${name}=${toString value}") self.meta.viewport) }";
|
||||
content = "${join ", " (
|
||||
mapAttrsToList (name: value: "${name}=${toString value}") self.meta.viewport
|
||||
)}";
|
||||
}}
|
||||
|
||||
${join "\n" (map
|
||||
(author: print-element' "meta" {
|
||||
${join "\n" (
|
||||
map (
|
||||
author:
|
||||
print-element' "meta" {
|
||||
name = "author";
|
||||
content = "${author}";
|
||||
})
|
||||
self.meta.authors)
|
||||
}
|
||||
) self.meta.authors
|
||||
)}
|
||||
|
||||
${join "\n" (map
|
||||
(stylesheet: print-element' "link" ({ rel = "stylesheet"; } // (removeAttrs stylesheet [ "categories" "__toString" ])))
|
||||
self.link.stylesheets)
|
||||
${join "\n" (
|
||||
map (
|
||||
stylesheet:
|
||||
print-element' "link" (
|
||||
{
|
||||
rel = "stylesheet";
|
||||
}
|
||||
// (removeAttrs stylesheet [
|
||||
"categories"
|
||||
"__toString"
|
||||
])
|
||||
)
|
||||
) self.link.stylesheets
|
||||
)}
|
||||
'';
|
||||
};
|
||||
|
||||
title = { name, ... }: {
|
||||
title =
|
||||
{ name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options.attrs = mkAttrs { };
|
||||
options.text = mkOption {
|
||||
|
@ -365,15 +414,21 @@ let
|
|||
|
||||
};
|
||||
|
||||
base = { name, ... }: {
|
||||
base =
|
||||
{ ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
# TODO: "A base element must have either an href attribute, a target attribute, or both."
|
||||
options = global-attrs // { inherit (attrs) href target; };
|
||||
options = global-attrs // {
|
||||
inherit (attrs) href target;
|
||||
};
|
||||
config.categories = [ "metadata" ];
|
||||
config.__toString = self: "<base${print-attrs self}>";
|
||||
};
|
||||
|
||||
link = { name, ... }: {
|
||||
link =
|
||||
{ ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = global-attrs // {
|
||||
# TODO: more attributes
|
||||
|
@ -382,7 +437,9 @@ let
|
|||
# XXX: there are variants of `rel` for `link`, `a`/`area`, and `form`
|
||||
rel = mkOption {
|
||||
# https://html.spec.whatwg.org/multipage/semantics.html#attr-link-rel
|
||||
type = with types; listOfUnique str (enum
|
||||
type =
|
||||
with types;
|
||||
listOfUnique str (enum
|
||||
# TODO: work out link types in detail, there are lots of additional constraints
|
||||
# https://html.spec.whatwg.org/multipage/links.html#linkTypes
|
||||
[
|
||||
|
@ -403,8 +460,7 @@ let
|
|||
"privacy-policy"
|
||||
"search"
|
||||
"terms-of-service"
|
||||
]
|
||||
);
|
||||
]);
|
||||
};
|
||||
};
|
||||
# TODO: figure out how to make body-ok `link` elements
|
||||
|
@ -415,7 +471,9 @@ let
|
|||
|
||||
# <link rel="stylesheet"> is implemented separately because it can be used both in `<head>` and `<body>`
|
||||
# semantically it's a standalone thing but syntactically happens to be subsumed under `<link>`
|
||||
stylesheet = { config, name, ... }: {
|
||||
stylesheet =
|
||||
{ config, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = global-attrs // {
|
||||
type = mkOption {
|
||||
|
@ -440,49 +498,74 @@ let
|
|||
inherit (link-attrs) href media integrity;
|
||||
};
|
||||
# https://html.spec.whatwg.org/multipage/links.html#link-type-stylesheet:body-ok
|
||||
config.categories = [ "metadata" "phrasing" ];
|
||||
config.__toString = self: print-attrs (removeAttrs self [ "categories" "__toString" ]);
|
||||
config.categories = [
|
||||
"metadata"
|
||||
"phrasing"
|
||||
];
|
||||
config.__toString =
|
||||
self:
|
||||
print-attrs (
|
||||
removeAttrs self [
|
||||
"categories"
|
||||
"__toString"
|
||||
]
|
||||
);
|
||||
};
|
||||
|
||||
body = { config, name, ... }: {
|
||||
body =
|
||||
{ config, name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
content = mkOption {
|
||||
type = with types;
|
||||
type =
|
||||
with types;
|
||||
let
|
||||
# Type check that ensures spec-compliant section hierarchy
|
||||
# https://html.spec.whatwg.org/multipage/sections.html#headings-and-outlines-2:concept-heading-7
|
||||
with-section-constraints = baseType: baseType // {
|
||||
merge = loc: defs:
|
||||
with-section-constraints =
|
||||
baseType:
|
||||
baseType
|
||||
// {
|
||||
merge =
|
||||
loc: defs:
|
||||
with lib;
|
||||
let
|
||||
find-and-attach = def:
|
||||
find-and-attach =
|
||||
def:
|
||||
let
|
||||
process-with-depth = depth: content:
|
||||
map
|
||||
(x:
|
||||
if isAttrs x && x ? section
|
||||
then x // {
|
||||
process-with-depth =
|
||||
depth: content:
|
||||
map (
|
||||
x:
|
||||
if isAttrs x && x ? section then
|
||||
x
|
||||
// {
|
||||
section = x.section // {
|
||||
heading-level = depth;
|
||||
content = process-with-depth (depth + 1) (x.section.content or [ ]);
|
||||
};
|
||||
}
|
||||
else x
|
||||
)
|
||||
content;
|
||||
else
|
||||
x
|
||||
) content;
|
||||
|
||||
find-with-depth = depth: content:
|
||||
find-with-depth =
|
||||
depth: content:
|
||||
let
|
||||
sections = map (v: { inherit (def) file; value = v; depth = depth; })
|
||||
(filter (x: isAttrs x && x ? section) content);
|
||||
subsections = concatMap
|
||||
(x:
|
||||
if isAttrs x && x ? section && x.section ? content
|
||||
then find-with-depth (depth + 1) x.section.content
|
||||
else [ ])
|
||||
content;
|
||||
sections = map (v: {
|
||||
inherit (def) file;
|
||||
value = v;
|
||||
depth = depth;
|
||||
}) (filter (x: isAttrs x && x ? section) content);
|
||||
subsections = concatMap (
|
||||
x:
|
||||
if isAttrs x && x ? section && x.section ? content then
|
||||
find-with-depth (depth + 1) x.section.content
|
||||
else
|
||||
[ ]
|
||||
) content;
|
||||
in
|
||||
sections ++ subsections;
|
||||
|
||||
|
@ -500,9 +583,12 @@ let
|
|||
if too-deep != [ ] then
|
||||
throw ''
|
||||
The option `${lib.options.showOption loc}` has sections nested too deeply:
|
||||
${concatMapStrings (sec: " - depth ${toString sec.depth} section in ${toString sec.file}\n") too-deep}
|
||||
${concatMapStrings (
|
||||
sec: " - depth ${toString sec.depth} section in ${toString sec.file}\n"
|
||||
) too-deep}
|
||||
Section hierarchy must not be deeper than 6 levels.''
|
||||
else baseType.merge loc (map (p: p.def // { value = p.processed; }) processed);
|
||||
else
|
||||
baseType.merge loc (map (p: p.def // { value = p.processed; }) processed);
|
||||
};
|
||||
in
|
||||
with-section-constraints
|
||||
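The constraint machinery above does two things: it tags every `section` with a `heading-level` equal to its nesting depth, and it rejects hierarchies deeper than six levels, matching HTML's `h1`–`h6`. A minimal sketch of the depth-tagging part, with hypothetical stand-alone names:

```nix
# Hypothetical stand-alone version of the depth-tagging step: each `section`
# gets a heading-level equal to its depth, and its content is processed one
# level deeper.
let
  lib = import <nixpkgs/lib>;
  attach = depth: content:
    map (
      x:
      if lib.isAttrs x && x ? section then
        x
        // {
          section = x.section // {
            heading-level = depth;
            content = attach (depth + 1) (x.section.content or [ ]);
          };
        }
      else
        x
    ) content;
in
attach 1 [ { section.content = [ { section = { }; } ]; } ]
# the outer section ends up with heading-level 1, the nested one with heading-level 2
```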
|
@ -513,16 +599,22 @@ let
|
|||
};
|
||||
|
||||
config.categories = [ ];
|
||||
config.__toString = self: with lib;
|
||||
print-element name self.attrs (join "\n" (map toString-unwrap self.content));
|
||||
config.__toString =
|
||||
self: with lib; print-element name self.attrs (join "\n" (map toString-unwrap self.content));
|
||||
};
|
||||
|
||||
section = { config, name, ... }: {
|
||||
section =
|
||||
{ config, name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
# setting to an attribute set will wrap the section in `<section>`
|
||||
attrs = mkOption {
|
||||
type = with types; nullOr (submodule { options = global-attrs; });
|
||||
type =
|
||||
with types;
|
||||
nullOr (submodule {
|
||||
options = global-attrs;
|
||||
});
|
||||
default = null;
|
||||
};
|
||||
heading = mkOption {
|
||||
|
@ -534,13 +626,21 @@ let
|
|||
# such an outline is rather meaningless without headings for navigation,
|
||||
# which is why we enforce headings in sections.
|
||||
# arguably, and this is encoded here, a section *is defined* by its heading.
|
||||
type = with types; submodule ({ config, ... }: {
|
||||
type =
|
||||
with types;
|
||||
submodule (
|
||||
{ config, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
# setting to an attribute set will wrap the section in `<hgroup>`
|
||||
hgroup.attrs = mkOption {
|
||||
type = with types; nullOr (submodule { options = global-attrs; });
|
||||
type =
|
||||
with types;
|
||||
nullOr (submodule {
|
||||
options = global-attrs;
|
||||
});
|
||||
default = with lib; if (config.before == [ ] && config.after == [ ]) then null else { };
|
||||
};
|
||||
# https://html.spec.whatwg.org/multipage/sections.html#the-hgroup-element
|
||||
|
@ -553,12 +653,12 @@ let
|
|||
type = with types; either str (listOf (attrTag categories.phrasing));
|
||||
};
|
||||
after = mkOption {
|
||||
type = with types;
|
||||
listOf (attrTag ({ inherit (element-types) p; } // categories.scripting));
|
||||
type = with types; listOf (attrTag ({ inherit (element-types) p; } // categories.scripting));
|
||||
default = [ ];
|
||||
};
|
||||
};
|
||||
});
|
||||
}
|
||||
);
|
||||
};
|
||||
# https://html.spec.whatwg.org/multipage/sections.html#headings-and-outlines
|
||||
content = mkOption {
|
||||
|
@ -573,28 +673,35 @@ let
|
|||
internal = true;
|
||||
};
|
||||
config = {
|
||||
categories = [ "flow" "sectioning" "palpable" ];
|
||||
__toString = self: with lib;
|
||||
categories = [
|
||||
"flow"
|
||||
"sectioning"
|
||||
"palpable"
|
||||
];
|
||||
__toString =
|
||||
self:
|
||||
with lib;
|
||||
let
|
||||
n = toString config.heading-level;
|
||||
heading = ''<h${n}${print-attrs self.heading.attrs}>${self.heading.content}</h${n}>'';
|
||||
hgroup = with lib; print-element "hgroup" self.heading.hgroup.attrs (squash ''
|
||||
hgroup =
|
||||
with lib;
|
||||
print-element "hgroup" self.heading.hgroup.attrs (squash ''
|
||||
${optionalString (!isNull self.heading.before) (toString-unwrap self.heading.before)}
|
||||
${heading}
|
||||
${optionalString (!isNull self.heading.after) (toString-unwrap self.heading.after)}
|
||||
'');
|
||||
content =
|
||||
(if isNull self.heading.hgroup.attrs then heading else hgroup)
|
||||
+
|
||||
join "\n" (map toString-unwrap self.content);
|
||||
+ join "\n" (map toString-unwrap self.content);
|
||||
in
|
||||
if !isNull self.attrs
|
||||
then print-element name self.attrs content
|
||||
else content;
|
||||
if !isNull self.attrs then print-element name self.attrs content else content;
|
||||
};
|
||||
};
|
||||
|
||||
p = { name, ... }: {
|
||||
p =
|
||||
{ name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
|
@ -602,20 +709,34 @@ let
|
|||
type = with types; either str (listOf (attrTag categories.phrasing));
|
||||
};
|
||||
};
|
||||
config.categories = [ "flow" "palpable" ];
|
||||
config.categories = [
|
||||
"flow"
|
||||
"palpable"
|
||||
];
|
||||
config.__toString = self: print-element name self.attrs (toString self.content);
|
||||
};
|
||||
|
||||
dl = { config, name, ... }: {
|
||||
dl =
|
||||
{ config, name, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
content = mkOption {
|
||||
type = with types; listOf (submodule ({ ... }: {
|
||||
type =
|
||||
with types;
|
||||
listOf (
|
||||
submodule (
|
||||
{ ... }:
|
||||
{
|
||||
options = {
|
||||
# TODO: wrap in `<div>` if set
|
||||
div.attrs = mkOption {
|
||||
type = with types; nullOr (submodule { options = global-attrs; });
|
||||
type =
|
||||
with types;
|
||||
nullOr (submodule {
|
||||
options = global-attrs;
|
||||
});
|
||||
default = null;
|
||||
};
|
||||
before = mkOption {
|
||||
|
@ -637,7 +758,9 @@ let
|
|||
default = [ ];
|
||||
};
|
||||
};
|
||||
}));
|
||||
}
|
||||
)
|
||||
);
|
||||
};
|
||||
};
|
||||
# XXX: here we can't express the spec requirement that `dl` is palpable if the list of term-description-pairs is nonempty.
|
||||
|
@ -648,11 +771,12 @@ let
|
|||
# it does help to concisely express type constraints on an element's children, but it seems that most of the categories in the spec can be ignored entirely in this implementation.
|
||||
# the cleanup task would be to identify which categories are really helpful, and document the rationale for using that mechanism as well as the specific choice of categories to keep.
|
||||
config.categories = [ "flow" ];
|
||||
config.__toString = self:
|
||||
config.__toString =
|
||||
self:
|
||||
with lib;
|
||||
let
|
||||
content = map
|
||||
(entry:
|
||||
content = map (
|
||||
entry:
|
||||
let
|
||||
list = squash ''
|
||||
${join "\n" entry.before}
|
||||
|
@ -662,36 +786,53 @@ let
|
|||
${join "\n" entry.after}
|
||||
'';
|
||||
in
|
||||
if !isNull entry.div.attrs
|
||||
then print-element "div" entry.div.attrs list
|
||||
else list
|
||||
)
|
||||
self.content;
|
||||
if !isNull entry.div.attrs then print-element "div" entry.div.attrs list else list
|
||||
) self.content;
|
||||
in
|
||||
print-element name self.attrs (join "\n" content);
|
||||
};
|
||||
|
||||
dt = { config, ... }: {
|
||||
dt =
|
||||
{ config, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
dt = mkOption {
|
||||
type = with types; either str (submodule (attrTag (
|
||||
type =
|
||||
with types;
|
||||
either str (
|
||||
submodule (
|
||||
attrTag (
|
||||
# TODO: test
|
||||
with lib; removeAttrs
|
||||
(filterAttrs
|
||||
(name: value: ! any (c: elem c [ "sectioning" "heading" ]) value.categories)
|
||||
categories.flow
|
||||
with lib;
|
||||
removeAttrs
|
||||
(filterAttrs (
|
||||
_name: value:
|
||||
!any (
|
||||
c:
|
||||
elem c [
|
||||
"sectioning"
|
||||
"heading"
|
||||
]
|
||||
) value.categories
|
||||
) categories.flow)
|
||||
[
|
||||
"header"
|
||||
"footer"
|
||||
]
|
||||
)
|
||||
[ "header" "footer" ]
|
||||
)));
|
||||
)
|
||||
);
|
||||
};
|
||||
};
|
||||
config.categories = [ ];
|
||||
config.__toString = self: print-element "dt" self.attrs self.dt;
|
||||
};
|
||||
|
||||
dd = { config, ... }: {
|
||||
dd =
|
||||
{ config, ... }:
|
||||
{
|
||||
imports = [ element ];
|
||||
options = {
|
||||
attrs = mkAttrs { };
|
||||
|
|
|
@ -1,9 +1,19 @@
|
|||
{ config, lib, pkgs, ... }: {
|
||||
{
|
||||
config,
|
||||
lib,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
{
|
||||
config.assets."style.css".path = ./style.css;
|
||||
config.assets."ngi-fediversity.svg".path = ./ngi-fediversity.svg;
|
||||
# TODO: auto-generate a bunch from SVG
|
||||
config.assets."favicon.png".path = ./favicon.png;
|
||||
config.assets."fonts.css".path = with lib; builtins.toFile "fonts.css" (join "\n" (map
|
||||
config.assets."fonts.css".path =
|
||||
with lib;
|
||||
builtins.toFile "fonts.css" (
|
||||
join "\n" (
|
||||
map
|
||||
(font: ''
|
||||
@font-face {
|
||||
font-family: '${font.name}';
|
||||
|
@ -13,15 +23,31 @@
|
|||
}
|
||||
'')
|
||||
(
|
||||
(crossLists (name: file: weight: { inherit name file weight; })
|
||||
[ [ "Signika" ] [ "signika-extended.woff2" "signika.woff2" ] [ 500 700 ] ]
|
||||
)
|
||||
++
|
||||
(crossLists (name: file: weight: { inherit name file weight; })
|
||||
[ [ "Heebo" ] [ "heebo-extended.woff2" "heebo.woff2" ] [ 400 600 ] ]
|
||||
(crossLists (name: file: weight: { inherit name file weight; }) [
|
||||
[ "Signika" ]
|
||||
[
|
||||
"signika-extended.woff2"
|
||||
"signika.woff2"
|
||||
]
|
||||
[
|
||||
500
|
||||
700
|
||||
]
|
||||
])
|
||||
++ (crossLists (name: file: weight: { inherit name file weight; }) [
|
||||
[ "Heebo" ]
|
||||
[
|
||||
"heebo-extended.woff2"
|
||||
"heebo.woff2"
|
||||
]
|
||||
[
|
||||
400
|
||||
600
|
||||
]
|
||||
])
|
||||
)
|
||||
)
|
||||
));
|
||||
);
|
||||
|
||||
# TODO: get directly from https://github.com/google/fonts
|
||||
# and compress with https://github.com/fonttools/fonttools
|
||||
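The `crossLists` calls above expand to one `@font-face` block per name/file/weight combination. A minimal sketch of what one such call yields, assuming the nixpkgs `lib.crossLists` this code already relies on:

```nix
# Sketch of a single crossLists call: the function is applied to every
# combination drawn from the given lists.
let
  lib = import <nixpkgs/lib>;
in
lib.crossLists (name: file: weight: { inherit name file weight; }) [
  [ "Signika" ]
  [ "signika-extended.woff2" "signika.woff2" ]
  [ 500 700 ]
]
# => four attribute sets, one per file/weight pair for "Signika"
```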
|
|
|
@ -1,17 +1,21 @@
|
|||
{ config, options, lib, pkgs, ... }:
|
||||
let
|
||||
inherit (lib)
|
||||
mkOption
|
||||
types
|
||||
;
|
||||
in
|
||||
{
|
||||
config,
|
||||
lib,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
|
||||
{
|
||||
config.templates.html = {
|
||||
dom = document:
|
||||
dom =
|
||||
document:
|
||||
let
|
||||
eval = lib.evalModules {
|
||||
class = "DOM";
|
||||
modules = [ document (import ./dom.nix) ];
|
||||
modules = [
|
||||
document
|
||||
(import ./dom.nix)
|
||||
];
|
||||
};
|
||||
in
|
||||
{
|
||||
|
@ -19,27 +23,34 @@ in
|
|||
value = eval.config;
|
||||
};
|
||||
|
||||
markdown = { name, body }:
|
||||
markdown =
|
||||
{ name, body }:
|
||||
let
|
||||
commonmark = pkgs.runCommand "${name}.html"
|
||||
commonmark =
|
||||
pkgs.runCommand "${name}.html"
|
||||
{
|
||||
buildInputs = [ pkgs.cmark ];
|
||||
} ''
|
||||
}
|
||||
''
|
||||
cmark ${builtins.toFile "${name}.md" body} > $out
|
||||
'';
|
||||
in
|
||||
builtins.readFile commonmark;
|
||||
nav = { menu, page }:
|
||||
nav =
|
||||
{ menu, page }:
|
||||
let
|
||||
render-item = item:
|
||||
if item ? menu then ''
|
||||
render-item =
|
||||
item:
|
||||
if item ? menu then
|
||||
''
|
||||
<li><details><summary>${item.menu.label}</summary>
|
||||
${lib.indent " " (item.menu.outputs.html page)}
|
||||
</li>
|
||||
''
|
||||
else if item ? page then ''<li><a href="${page.link item.page}">${item.page.title}</a></li>''
|
||||
else ''<li><a href="${item.link.url}">${item.link.label}</a></li>''
|
||||
;
|
||||
else if item ? page then
|
||||
''<li><a href="${page.link item.page}">${item.page.title}</a></li>''
|
||||
else
|
||||
''<li><a href="${item.link.url}">${item.link.label}</a></li>'';
|
||||
in
|
||||
''
|
||||
<nav>
|
||||
|
@ -50,17 +61,27 @@ in
|
|||
'';
|
||||
};
|
||||
|
||||
config.templates.files = fs: with lib;
|
||||
config.templates.files =
|
||||
fs:
|
||||
with lib;
|
||||
foldl'
|
||||
# TODO: create static redirects from `tail <collection>.locations`
|
||||
(acc: elem: acc // (mapAttrs' (type: value: {
|
||||
(
|
||||
acc: elem:
|
||||
acc
|
||||
//
|
||||
(mapAttrs' (
|
||||
type: value: {
|
||||
name = head elem.locations + optionalString (type != "") ".${type}";
|
||||
value = if isStorePath value then value else
|
||||
builtins.toFile
|
||||
(elem.name + optionalString (type != "") ".${type}")
|
||||
(toString value);
|
||||
}))
|
||||
elem.outputs)
|
||||
value =
|
||||
if isStorePath value then
|
||||
value
|
||||
else
|
||||
builtins.toFile (elem.name + optionalString (type != "") ".${type}") (toString value);
|
||||
}
|
||||
))
|
||||
elem.outputs
|
||||
)
|
||||
{ }
|
||||
fs;
|
||||
}
|
||||
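For orientation, the fold above turns a list of documents into a flat attribute set of output files keyed by `<location>.<type>`, with the bare location used for the untyped `""` output. A minimal sketch with hypothetical sample documents:

```nix
# Hypothetical sketch of the files fold. The real code additionally writes
# non-store-path values out with builtins.toFile; here plain strings are kept
# to show only the naming scheme.
let
  lib = import <nixpkgs/lib>;
  docs = [
    { name = "index"; locations = [ "index" ]; outputs.html = "<html></html>"; }
    { name = "logo"; locations = [ "logo" ]; outputs."" = "<svg></svg>"; }
  ];
in
lib.foldl' (
  acc: doc:
  acc
  // lib.mapAttrs' (type: value: {
    name = builtins.head doc.locations + lib.optionalString (type != "") ".${type}";
    inherit value;
  }) doc.outputs
) { } docs
# => { "index.html" = "<html></html>"; "logo" = "<svg></svg>"; }
```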
|
|
|
@ -1,12 +1,20 @@
|
|||
{ config, options, lib, ... }:
|
||||
{
|
||||
config,
|
||||
options,
|
||||
lib,
|
||||
...
|
||||
}:
|
||||
let
|
||||
inherit (lib) mkOption
|
||||
inherit (lib)
|
||||
mkOption
|
||||
types
|
||||
;
|
||||
cfg = config;
|
||||
in
|
||||
{
|
||||
content-types.article = { config, collection, ... }: {
|
||||
content-types.article =
|
||||
{ config, collection, ... }:
|
||||
{
|
||||
imports = [ cfg.content-types.page ];
|
||||
options = {
|
||||
collection = mkOption {
|
||||
|
@ -26,26 +34,29 @@ in
|
|||
};
|
||||
};
|
||||
config.name = with lib; mkDefault (slug config.title);
|
||||
config.outputs.html = lib.mkForce
|
||||
((cfg.templates.html.page config).override (final: prev: {
|
||||
config.outputs.html = lib.mkForce (
|
||||
(cfg.templates.html.page config).override (
|
||||
_final: prev: {
|
||||
html = {
|
||||
# TODO: make authors always a list
|
||||
head.meta.authors = if lib.isList config.author then config.author else [ config.author ];
|
||||
body.content = with lib; map
|
||||
(e:
|
||||
if isAttrs e && e ? section
|
||||
then
|
||||
recursiveUpdate e
|
||||
{
|
||||
body.content =
|
||||
with lib;
|
||||
map (
|
||||
e:
|
||||
if isAttrs e && e ? section then
|
||||
recursiveUpdate e {
|
||||
section.heading = {
|
||||
before = [{ p.content = "Published ${config.date}"; }];
|
||||
after = [{ p.content = "Written by ${config.author}"; }];
|
||||
before = [ { p.content = "Published ${config.date}"; } ];
|
||||
after = [ { p.content = "Written by ${config.author}"; } ];
|
||||
};
|
||||
}
|
||||
else e
|
||||
)
|
||||
prev.html.body.content;
|
||||
else
|
||||
e
|
||||
) prev.html.body.content;
|
||||
};
|
||||
}));
|
||||
}
|
||||
)
|
||||
);
|
||||
};
|
||||
}
|
||||
|
|
|
@ -11,24 +11,34 @@ in
|
|||
description = ''
|
||||
Collection of assets, i.e. static files that can be linked to from within documents
|
||||
'';
|
||||
type = with types; attrsOf (submodule ({ config, ... }: {
|
||||
type =
|
||||
with types;
|
||||
attrsOf (
|
||||
submodule (
|
||||
{ config, ... }:
|
||||
{
|
||||
imports = [ cfg.content-types.document ];
|
||||
options.path = mkOption {
|
||||
type = types.path;
|
||||
};
|
||||
config.outputs."" = if lib.isStorePath config.path then config.path else "${config.path}";
|
||||
}));
|
||||
}
|
||||
)
|
||||
);
|
||||
default = { };
|
||||
};
|
||||
|
||||
config.files = with lib;
|
||||
config.files =
|
||||
with lib;
|
||||
let
|
||||
flatten = attrs: mapAttrsToList
|
||||
(name: value:
|
||||
flatten =
|
||||
attrs:
|
||||
mapAttrsToList (
|
||||
_name: value:
|
||||
# HACK: we somehow have to distinguish a module value from regular attributes.
|
||||
# arbitrary choice: the outputs attribute
|
||||
if value ? outputs then value else mapAttrsToList value)
|
||||
attrs;
|
||||
if value ? outputs then value else mapAttrsToList value
|
||||
) attrs;
|
||||
in
|
||||
cfg.templates.files (flatten cfg.assets);
|
||||
}
|
||||
|
|
|
@ -1,4 +1,10 @@
|
|||
{ config, options, lib, pkgs, ... }:
|
||||
{
|
||||
config,
|
||||
options,
|
||||
lib,
|
||||
...
|
||||
}:
|
||||
|
||||
let
|
||||
inherit (lib)
|
||||
mkOption
|
||||
|
@ -6,6 +12,7 @@ let
|
|||
;
|
||||
cfg = config;
|
||||
in
|
||||
|
||||
{
|
||||
options.collections = mkOption {
|
||||
description = ''
|
||||
|
@ -25,7 +32,12 @@ in
|
|||
}
|
||||
```
|
||||
'';
|
||||
type = with types; attrsOf (submodule ({ name, config, ... }: {
|
||||
type =
|
||||
with types;
|
||||
attrsOf (
|
||||
submodule (
|
||||
{ name, config, ... }:
|
||||
{
|
||||
options = {
|
||||
type = mkOption {
|
||||
description = "Type of entries in the collection";
|
||||
|
@ -52,7 +64,9 @@ in
|
|||
};
|
||||
entry = mkOption {
|
||||
description = "An entry in the collection";
|
||||
type = with types; collection (submodule ({
|
||||
type =
|
||||
with types;
|
||||
collection (submodule ({
|
||||
imports = [ config.type ];
|
||||
_module.args.collection = config;
|
||||
process-locations = ls: with lib; concatMap (l: map (p: "${p}/${l}") config.prefixes) ls;
|
||||
|
@ -61,10 +75,19 @@ in
|
|||
by-name = mkOption {
|
||||
description = "Entries accessible by symbolic name";
|
||||
type = with types; attrsOf attrs;
|
||||
default = with lib; listToAttrs (map (e: { name = e.name; value = e; }) config.entry);
|
||||
default =
|
||||
with lib;
|
||||
listToAttrs (
|
||||
map (e: {
|
||||
name = e.name;
|
||||
value = e;
|
||||
}) config.entry
|
||||
);
|
||||
};
|
||||
};
|
||||
}));
|
||||
}
|
||||
)
|
||||
);
|
||||
};
|
||||
|
||||
config.files =
|
||||
|
|
|
@ -1,10 +1,14 @@
|
|||
{ config, options, lib, pkgs, ... }:
|
||||
{
|
||||
config,
|
||||
options,
|
||||
lib,
|
||||
...
|
||||
}:
|
||||
let
|
||||
inherit (lib)
|
||||
mkOption
|
||||
types
|
||||
;
|
||||
cfg = config;
|
||||
in
|
||||
{
|
||||
imports = lib.nixFiles ./.;
|
||||
|
@ -14,7 +18,15 @@ in
|
|||
type = with types; attrsOf deferredModule;
|
||||
};
|
||||
|
||||
config.content-types.document = { name, config, options, link, ... }: {
|
||||
config.content-types.document =
|
||||
{
|
||||
name,
|
||||
config,
|
||||
options,
|
||||
link,
|
||||
...
|
||||
}:
|
||||
{
|
||||
config._module.args.link = config.link;
|
||||
options = {
|
||||
name = mkOption {
|
||||
|
@ -37,7 +49,10 @@ in
|
|||
'';
|
||||
type = with types; nonEmptyListOf str;
|
||||
apply = config.process-locations;
|
||||
example = [ "about/overview" "index" ];
|
||||
example = [
|
||||
"about/overview"
|
||||
"index"
|
||||
];
|
||||
default = [ config.name ];
|
||||
};
|
||||
process-locations = mkOption {
|
||||
|
@ -51,24 +66,26 @@ in
|
|||
# TODO: we may want links to other representations,
|
||||
# and currently the mapping of output types to output file
|
||||
# names is soft.
|
||||
default = with lib; target:
|
||||
default =
|
||||
with lib;
|
||||
target:
|
||||
let
|
||||
path = relativePath (head config.locations) (head target.locations);
|
||||
links = mapAttrs
|
||||
(type: output:
|
||||
path + optionalString (type != "") ".${type}"
|
||||
links = mapAttrs (
|
||||
type: _output: path + optionalString (type != "") ".${type}"
|
||||
# ^^^^^^^^^^^^
|
||||
# convention for raw files
|
||||
)
|
||||
target.outputs;
|
||||
) target.outputs;
|
||||
in
|
||||
if length (attrValues links) == 0
|
||||
then throw "no output to link to for '${target.name}'"
|
||||
else if length (attrValues links) == 1
|
||||
then links // {
|
||||
if length (attrValues links) == 0 then
|
||||
throw "no output to link to for '${target.name}'"
|
||||
else if length (attrValues links) == 1 then
|
||||
links
|
||||
// {
|
||||
__toString = _: head (attrValues links);
|
||||
}
|
||||
else links;
|
||||
else
|
||||
links;
|
||||
};
|
||||
outputs = mkOption {
|
||||
description = ''
|
||||
|
|
|
@ -1,4 +1,9 @@
|
|||
{ config, options, lib, ... }:
|
||||
{
|
||||
config,
|
||||
options,
|
||||
lib,
|
||||
...
|
||||
}:
|
||||
let
|
||||
inherit (lib)
|
||||
mkOption
|
||||
|
@ -7,7 +12,9 @@ let
|
|||
cfg = config;
|
||||
in
|
||||
{
|
||||
content-types.event = { config, collection, ... }: {
|
||||
content-types.event =
|
||||
{ config, collection, ... }:
|
||||
{
|
||||
imports = [ cfg.content-types.page ];
|
||||
options = {
|
||||
collection = mkOption {
|
||||
|
@ -41,41 +48,49 @@ in
|
|||
};
|
||||
config.name = with lib; mkDefault (slug config.title);
|
||||
config.summary = lib.mkDefault config.description;
|
||||
config.outputs.html = lib.mkForce
|
||||
((cfg.templates.html.page config).override (final: prev: {
|
||||
html.body.content = with lib; map
|
||||
(e:
|
||||
if isAttrs e && e ? section
|
||||
then
|
||||
recursiveUpdate e
|
||||
{
|
||||
config.outputs.html = lib.mkForce (
|
||||
(cfg.templates.html.page config).override (
|
||||
_final: prev: {
|
||||
html.body.content =
|
||||
with lib;
|
||||
map (
|
||||
e:
|
||||
if isAttrs e && e ? section then
|
||||
recursiveUpdate e {
|
||||
section.content = [
|
||||
{
|
||||
dl.content = [
|
||||
dl.content =
|
||||
[
|
||||
{
|
||||
terms = [{ dt = "Location"; }];
|
||||
descriptions = [{ dd = config.location; }];
|
||||
terms = [ { dt = "Location"; } ];
|
||||
descriptions = [ { dd = config.location; } ];
|
||||
}
|
||||
{
|
||||
terms = [{ dt = "Start"; }];
|
||||
descriptions = [{
|
||||
terms = [ { dt = "Start"; } ];
|
||||
descriptions = [
|
||||
{
|
||||
dd = config.start-date + lib.optionalString (!isNull config.start-time) " ${config.start-time}";
|
||||
}];
|
||||
}
|
||||
] ++ lib.optional (!isNull config.end-date) {
|
||||
terms = [{ dt = "End"; }];
|
||||
descriptions = [{
|
||||
dd = config.end-date + lib.optionalString (!isNull config.end-time) " ${config.end-time}";
|
||||
}];
|
||||
};
|
||||
];
|
||||
}
|
||||
]
|
||||
++ e.section.content;
|
||||
++ lib.optional (!isNull config.end-date) {
|
||||
terms = [ { dt = "End"; } ];
|
||||
descriptions = [
|
||||
{
|
||||
dd = config.end-date + lib.optionalString (!isNull config.end-time) " ${config.end-time}";
|
||||
}
|
||||
else e
|
||||
)
|
||||
prev.html.body.content;
|
||||
];
|
||||
};
|
||||
}
|
||||
] ++ e.section.content;
|
||||
}
|
||||
else
|
||||
e
|
||||
) prev.html.body.content;
|
||||
|
||||
}));
|
||||
}
|
||||
)
|
||||
);
|
||||
};
|
||||
}
|
||||
|
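In the event hunk above, the `End` row is appended with `lib.optional`, which yields either a one-element list or an empty list, so it can simply be concatenated. A small sketch with dummy values:

```nix
# Sketch of the conditional "End" row: with a null end date, lib.optional
# contributes nothing to the definition list.
let
  lib = import <nixpkgs/lib>;
  end-date = null; # dummy value
in
[ { dt = "Location"; } { dt = "Start"; } ] ++ lib.optional (end-date != null) { dt = "End"; }
# => only the Location and Start rows, because end-date is null
```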
|
|
@ -1,11 +1,18 @@
|
|||
{ config, options, lib, ... }:
|
||||
{
|
||||
config,
|
||||
options,
|
||||
lib,
|
||||
...
|
||||
}:
|
||||
let
|
||||
inherit (lib)
|
||||
mkOption
|
||||
types
|
||||
;
|
||||
cfg = config;
|
||||
subtype = baseModule: types.submodule [
|
||||
subtype =
|
||||
baseModule:
|
||||
types.submodule [
|
||||
baseModule
|
||||
{
|
||||
_module.freeformType = types.attrs;
|
||||
|
@ -23,7 +30,9 @@ in
|
|||
type = with types; attrsOf (submodule config.content-types.navigation);
|
||||
};
|
||||
|
||||
config.content-types.named-link = { ... }: {
|
||||
config.content-types.named-link =
|
||||
{ ... }:
|
||||
{
|
||||
options = {
|
||||
label = mkOption {
|
||||
description = "Link label";
|
||||
|
@ -36,7 +45,9 @@ in
|
|||
};
|
||||
};
|
||||
|
||||
config.content-types.navigation = { name, config, ... }: {
|
||||
config.content-types.navigation =
|
||||
{ name, config, ... }:
|
||||
{
|
||||
options = {
|
||||
name = mkOption {
|
||||
description = "Symbolic name, used as a human-readable identifier";
|
||||
|
@ -50,7 +61,9 @@ in
|
|||
};
|
||||
items = mkOption {
|
||||
description = "List of menu items";
|
||||
type = with types; listOf (attrTag {
|
||||
type =
|
||||
with types;
|
||||
listOf (attrTag {
|
||||
menu = mkOption { type = submodule cfg.content-types.navigation; };
|
||||
page = mkOption { type = subtype cfg.content-types.page; };
|
||||
link = mkOption { type = submodule cfg.content-types.named-link; };
|
||||
|
@ -63,8 +76,11 @@ in
|
|||
It must be a function that takes the page on which the navigation is to be shown, such that relative links get computed correctly.
|
||||
'';
|
||||
type = with types; attrsOf (functionTo str);
|
||||
default.html = page: cfg.templates.html.nav {
|
||||
menu = config; inherit page;
|
||||
default.html =
|
||||
page:
|
||||
cfg.templates.html.nav {
|
||||
menu = config;
|
||||
inherit page;
|
||||
};
|
||||
};
|
||||
};
|
||||
|
|
|
@ -17,7 +17,9 @@ in
|
|||
|
||||
config.files = with lib; cfg.templates.files (attrValues config.pages);
|
||||
|
||||
config.content-types.page = { name, config, ... }: {
|
||||
config.content-types.page =
|
||||
{ name, config, ... }:
|
||||
{
|
||||
imports = [ cfg.content-types.document ];
|
||||
options = {
|
||||
title = mkOption {
|
||||
|
|
|
@ -4,14 +4,35 @@ let
inherit (import ./. { }) lib;
in
{
test-relativePath = with lib;
test-relativePath =
with lib;
let
testData = [
{ from = "bar"; to = "baz"; expected = "./baz"; }
{ from = "foo/bar"; to = "foo/baz"; expected = "./baz"; }
{ from = "foo"; to = "bar/baz"; expected = "./bar/baz"; }
{ from = "foo/bar"; to = "baz"; expected = "./../baz"; }
{ from = "foo/bar/baz"; to = "foo"; expected = "./../../foo"; }
{
from = "bar";
to = "baz";
expected = "./baz";
}
{
from = "foo/bar";
to = "foo/baz";
expected = "./baz";
}
{
from = "foo";
to = "bar/baz";
expected = "./bar/baz";
}
{
from = "foo/bar";
to = "baz";
expected = "./../baz";
}
{
from = "foo/bar/baz";
to = "foo";
expected = "./../../foo";
}
];
in
{
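The hunk cuts off before the test attributes themselves, but a file of this shape can be evaluated directly. A minimal sketch, assuming the file is `tests.nix` and exposes `{ expr, expected }` pairs compatible with nixpkgs' `lib.debug.runTests`:

```nix
# Hypothetical runner for a test file like the one above. runTests returns the
# failing cases, so an empty list means every test passed.
let
  lib = import <nixpkgs/lib>;
  tests = import ./tests.nix; # assumed file name for the hunk shown here
in
lib.debug.runTests tests
```

Evaluating such a file with `nix-instantiate --eval --strict` should print the resulting attribute set directly.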