Compare commits

...

102 Commits

Author SHA1 Message Date
Tomáš Mládek 72b928067c chore(jslib): add timeout log to api calls
Might potentially help with spurious AbortError issues
2024-05-04 16:59:17 +02:00
Tomáš Mládek 7e9d4349af feat(webui): upload to groups via EntityList
(finishes #21)
2024-04-21 22:03:17 +02:00
Tomáš Mládek 426c584215 feat(webui): AddModal allows upload directly to groups
(addresses #21)
2024-04-21 22:03:17 +02:00
Tomáš Mládek 1118a5cfeb refactor(webui): typed Selector events 2024-04-21 22:03:17 +02:00
Tomáš Mládek e9dd4d1383 fix(webui): don't show editable label in UpObjectCard 2024-04-21 21:19:44 +02:00
Tomáš Mládek e06d2bccfe style(webui): add icons to Inspect sections
2024-04-12 15:40:41 +02:00
Tomáš Mládek 9f61581ba7 style(webui): add icons to InspectTypeEditor 2024-04-12 15:29:32 +02:00
Tomáš Mládek bc74fbfff6 style(webui): fix key alignment in UpObject 2024-04-12 15:27:30 +02:00
Tomáš Mládek 8d165e1f8c style(webui): fix button alignment in entry lists
2024-04-12 15:25:47 +02:00
Tomáš Mládek 97f6dd86bf style(webui): LabelBorder hidden state is indicated by double border
2024-04-12 15:21:43 +02:00
Tomáš Mládek 041c058a77 refactor(webui): LabelBorder uses Svelte transitions, tidy CSS 2024-04-12 15:20:47 +02:00
Tomáš Mládek 1bd83062bb fix(webui): Inspect correctly detects un/typed entries of a group
2024-04-12 15:06:55 +02:00
Tomáš Mládek 58c5329781 fix(webui): Footer correctly displays over content
also a11y fixes, import fix
2024-04-12 15:03:17 +02:00
Tomáš Mládek 07a150b99d fix: jslib wrong query param
2024-04-08 22:14:37 +02:00
Tomáš Mládek 1738643050 ci: add SENTRY_AUTH_TOKEN secret, fix source map uploads 2024-04-08 21:53:57 +02:00
Tomáš Mládek 3b32597fb6 feat(jslib): getRaw can return authenticated url
2024-04-08 21:40:14 +02:00
Tomáš Mládek a30ef465a3 Revert "fix: temporarily (?) disable auth on /raw endpoint"
This reverts commit 750bca9ee0.
2024-04-08 21:34:27 +02:00
Tomáš Mládek 069c86855b feat: accept auth key in query param 2024-04-08 21:34:08 +02:00
Tomáš Mládek f9002604fe style(webui): link UpObject can be clicked whole
also slight refactor on UpObject especially banner and button sizing fixes
2024-04-06 00:35:11 +02:00
Tomáš Mládek edc666f56a fix: errant > 2024-04-06 00:35:11 +02:00
Tomáš Mládek 750bca9ee0 fix: temporarily (?) disable auth on /raw endpoint
2024-04-04 23:07:05 +02:00
Tomáš Mládek 703a3e5391 fix: add `name` attributes to login modal, prompt browser to save credentials
2024-04-04 22:48:40 +02:00
Tomáš Mládek 50020b969e fix: don't reveal whether a user exists
2024-04-04 21:27:44 +02:00
Tomáš Mládek 60a8b15164 feat(webui): users can change their passwords 2024-04-04 21:27:44 +02:00
Tomáš Mládek 17bc53a6fe feat: add Sentry user feedback
2024-04-04 20:25:04 +02:00
Tomáš Mládek f9037a4370 refactor: config object is fully optional for SDK js, message for errors
2024-04-03 11:10:34 +02:00
Tomáš Mládek 196447da0f feat: add `user` to every Entry
(very ugly, lots of clones)
2024-04-03 11:10:34 +02:00
Tomáš Mládek 05ee557d1a feat: add user management
- no more static keys, full register/login/logout flow
- add API error type
- refactor API to centralize request calls
- minor refactors re: vault options
- CSS refactor (buttons don't require classes, input styling)
2024-04-03 11:10:34 +02:00
Tomáš Mládek 02bfe94f39 feat(backend): users with passwords 2024-04-03 11:10:34 +02:00
Tomáš Mládek 0e59bc8bd5 style(webui): contain COVERs in UpObject headers 2024-04-03 11:10:34 +02:00
Tomáš Mládek 8932341445 fix(webui): action buttons no longer hidden on entries with long labels
2024-04-02 16:49:04 +02:00
Tomáš Mládek 1f270d6dc7 feat(webui): quality of life improvements for upload dialog
- when uploading, warn before closing tab
- allow cancelling in progress uploads
- when uploading multiple files, scroll to the current file
2024-04-01 21:17:44 +02:00
Tomáš Mládek 669b348160 refactor: fix lint
2024-03-31 17:09:23 +02:00
Tomáš Mládek 175518e3a6 refactor: allow known clippy issues 2024-03-31 17:09:23 +02:00
Tomáš Mládek 94818b992a dev: add +dev-update-sdk target
(why doesn't dev-local update as expected?)
2024-03-31 17:09:23 +02:00
Tomáš Mládek f2261998ee refactor: properly import tracing macros 2024-03-31 17:09:23 +02:00
Tomáš Mládek 730cc02d7a fix(base): null attribute deserializes correctly
also add type address de/serialization tests
2024-03-31 17:09:23 +02:00
Tomáš Mládek 4d8ac0717d fix(webui): don't disappear selectors while adding entries if input has been made
2024-03-31 16:24:05 +02:00
Tomáš Mládek 68e7d67d7b fix(webui): upload modal correctly displays over content 2024-03-31 15:08:16 +02:00
Tomáš Mládek cb7dfadf3d feat(webui): add sentry
2024-03-30 13:23:27 +01:00
Tomáš Mládek 35e1e902a2 feat: persist vault rescan mode if unset and passed via CLI
2024-03-02 17:55:18 +01:00
Tomáš Mládek 1e9f83d043 dev: dedicated "local dependencies" earthly target
2024-02-26 20:36:43 +01:00
Tomáš Mládek 88170789a0 dev: remove react from dependencies, fix WebStorm? 2024-02-26 20:36:43 +01:00
Tomáš Mládek e03e09ccaf lint: fix lint 2024-02-26 20:36:43 +01:00
Tomáš Mládek 58ca734443 style(webui): slightly smaller attribute in UpEntry 2024-02-26 20:36:43 +01:00
Tomáš Mládek 7897ce7354 fix(webui): UpEntry (selector) correct overflow
also add stories
2024-02-26 20:36:43 +01:00
Tomáš Mládek d87405ae5b dev: add intellij run configurations 2024-02-26 20:36:43 +01:00
Tomáš Mládek c5e14eae0d fix(webui): UpObject correct spacing 2024-02-26 20:36:43 +01:00
Tomáš Mládek 4ccfc63318 fix(webui): Ellipsis properly limits overflow 2024-02-26 20:36:43 +01:00
Tomáš Mládek 894faa94ae dev: add narrow UpObject story to test overflow/ellipsis 2024-02-26 20:36:43 +01:00
Tomáš Mládek 0b488d9384 lint fixes 2024-02-26 20:36:43 +01:00
Tomáš Mládek 121c615642 dev: (re) add storybook 2024-02-26 20:36:43 +01:00
Tomáš Mládek cd008c10e2 fix: extractors no longer crash (error due to refactor) 2024-02-19 22:35:02 +01:00
Tomáš Mládek 0ede2af16c dev: backend dev run configuration specifies rescan mode 2024-02-19 22:27:49 +01:00
Tomáš Mládek 3e5353a5a4 dev: update .earthlyignore to ignore all node_modules 2024-02-18 18:04:32 +01:00
Tomáš Mládek ff44061a21 refactor: fix scss lint
2024-02-18 14:11:08 +01:00
Tomáš Mládek 794b130645 feat(webui): display `COVER` image as the column background
2024-02-17 23:02:38 +01:00
Tomáš Mládek 2faa113691 feat(webui): labels can be edited via column header (banner)
fixes #55
known issue: replaces all labels with one
2024-02-17 17:32:48 +01:00
Tomáš Mládek dd9ff79e20 fixup! fix(webui): editable respects initial value 2024-02-17 17:32:48 +01:00
Tomáš Mládek 050e3f81d7 refactor(webui): add types to some components' event dispatchers 2024-02-17 17:32:48 +01:00
Tomáš Mládek afe0b858b6 style(webui): Selector options have unified font size/weight, shadow 2024-02-17 17:32:48 +01:00
Tomáš Mládek 656dc23bfb fix(webui): IconButton passes down `plain` attribute
also has valid markup
2024-02-17 17:32:48 +01:00
Tomáš Mládek 1dd4f059d3 fix(webui): editable respects initial value 2024-02-17 17:32:48 +01:00
Tomáš Mládek 7b1c37eb54 dev: fix dev frontend run config 2024-02-17 17:32:48 +01:00
Tomáš Mládek a2396675c5 dev(jslib): fix js sdk lint 2024-02-17 17:32:48 +01:00
Tomáš Mládek ab17644b0d test(jslib): migrate from ava to jest
2024-02-17 15:24:13 +01:00
Tomáš Mládek 4c3727451b refactor(jslib): separate `src` and `dist` dirs
(break tests)
2024-02-17 15:12:23 +01:00
Tomáš Mládek e32233c4f7 dev: move wasm to root
2024-02-17 14:09:31 +01:00
Tomáš Mládek 473cb2ffa0 dev: move tools/upend_js,py to sdks 2024-02-17 14:09:30 +01:00
Tomáš Mládek 9b52eba0b4 dev: remove fromksx 2024-02-17 14:09:30 +01:00
Tomáš Mládek 052c56ed1d dev: remove Taskfile 2024-02-17 10:34:06 +01:00
Tomáš Mládek afa5bd088d refactor: Attributes are their proper type instead of strings
Also adds checking for non-emptiness and upper-casing
2024-02-15 19:10:22 +01:00
Tomáš Mládek c5c157a856 fix(webui): fix cursor position on empty note
2024-02-06 22:50:33 +01:00
Tomáš Mládek 3344e69544 feat(webui): notes can now contain newlines 2024-02-06 22:49:32 +01:00
Tomáš Mládek 33768e2695 feat(webui): add status indicator for notes editor
2024-02-06 22:34:55 +01:00
Tomáš Mládek 9d6ebfc31c fix(webui): Notes aren't duplicated (manifested as unreliable saving)
also rework semantics of `WidgetChange`
2024-02-06 22:33:53 +01:00
Tomáš Mládek f1b608f824 fix(webui): upload dialog's position is fixed on screen
2024-02-06 13:05:54 +01:00
Tomáš Mládek ea9aa96674 Update CHANGELOG
2024-02-06 10:00:23 +01:00
Tomáš Mládek ce4e045e07 dev: git ignore uploaded files in example_vault
2024-02-06 09:46:25 +01:00
Tomáš Mládek c246b267d1 feat(webui): start upload on Enter press 2024-02-06 09:46:25 +01:00
Tomáš Mládek 53135d4a9e style(webui): upload progress bar spacing, hide add button 2024-02-06 09:46:25 +01:00
Tomáš Mládek 3196294033 feat(webui): select all uploaded files when done 2024-02-06 09:46:25 +01:00
Tomáš Mládek 1d1476c7b8 dev: intellij dev config builds jslib before webui launch 2024-02-06 09:46:25 +01:00
Tomáš Mládek 9f2f7c0218 fix(jslib): fix types for `putBlob()`, returns a single address 2024-02-06 09:46:25 +01:00
Tomáš Mládek 787aa00f94 feat(webui): files can be added or removed from the upload dialog
2024-02-05 22:23:28 +01:00
Tomáš Mládek de3ef7de0f dev(webui): force rebundling of dependencies for `dev` script
no need to delete node_modules anymore!
2024-02-05 22:23:28 +01:00
Tomáš Mládek ec81f8147b feat(webui,jslib): upload progress 2024-02-05 22:23:28 +01:00
Tomáš Mládek 59c2d9c078 ci: remove parallelization 2024-02-05 22:23:28 +01:00
Tomáš Mládek f18217a3e5 ci: update Earthly image version 2024-02-05 22:23:28 +01:00
Tomáš Mládek ba221c2662 ci: get rid of AppImage upload to S3 2024-02-05 22:23:28 +01:00
Tomáš Mládek c16ff963c8 build: fix upend-bin target
can't save artifacts from CACHEd locations, I guess
2024-02-05 22:23:28 +01:00
Tomáš Mládek 303ac3ec07 ...
ci: remove duplicate cargo build command
2024-02-05 22:22:53 +01:00
Tomáš Mládek 3dcfe48803 ci: cache all rust earthly targets 2024-02-05 22:22:53 +01:00
Tomáš Mládek e6862351f9 build: further refactor Earthfile & build process
separate strict/release & nightly builds, avoid LOCAL by default
2024-02-05 22:22:53 +01:00
Tomáš Mládek 2da5a28a42 build(webext): update shared paths with webui, fix build 2024-02-05 22:22:43 +01:00
Tomáš Mládek 316f236d3a ci: --force pnpm install, DRY Earthfile slightly
2024-02-03 15:07:21 +01:00
Tomáš Mládek 1660585df3 ci: enable CACHE
2024-02-03 00:33:51 +01:00
Tomáš Mládek 009007fc8b Update CHANGELOG
2024-02-02 16:11:23 +01:00
Tomáš Mládek 298d92c9a5 refactor(webui): fix typo, rename ProgessBar -> ProgressBar 2024-02-02 16:11:00 +01:00
Tomáš Mládek f14c035051 fix(webui): fix upload, re-add forgotten components (Footer, AddModal, DropPasteHandler) 2024-02-02 16:10:39 +01:00
Tomáš Mládek d047eaf7ac ci: update Earthly image version
2024-01-30 23:00:43 +01:00
Tomáš Mládek f1184ad2b3 style(webui): fix uneven heights of roots
2024-01-30 11:07:08 +01:00
142 changed files with 11531 additions and 2819 deletions

.earthlyignore

@ -1,6 +1,7 @@
node_modules/*
/.pnpm/*
/.cargo/*
*/node_modules
.pnpm/*
.cargo/*
upend.sqlite3
.upend/*

.gitignore

@ -9,3 +9,5 @@ upend.sqlite3
/.pnpm
/.cargo
example_vault/zb*


@ -1,6 +1,6 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="dev backend" type="CargoCommandRunConfiguration" factoryName="Cargo Command">
<option name="command" value="run -- serve ./example_vault --clean --no-browser --reinitialize" />
<option name="command" value="run -- serve ./example_vault --clean --no-browser --reinitialize --rescan-mode mirror --secret upend" />
<option name="workingDirectory" value="file://$PROJECT_DIR$" />
<envs />
<option name="emulateTerminal" value="true" />


@ -0,0 +1,19 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="dev backend storybook" type="CargoCommandRunConfiguration" factoryName="Cargo Command">
<option name="command" value="run -- serve ./example_vault --clean --no-browser --reinitialize --rescan-mode mirror --bind 127.0.0.1:8099" />
<option name="workingDirectory" value="file://$PROJECT_DIR$" />
<envs />
<option name="emulateTerminal" value="true" />
<option name="channel" value="DEFAULT" />
<option name="requiredFeatures" value="true" />
<option name="allFeatures" value="false" />
<option name="withSudo" value="false" />
<option name="buildTarget" value="REMOTE" />
<option name="backtrace" value="SHORT" />
<option name="isRedirectInput" value="false" />
<option name="redirectInputPath" value="" />
<method v="2">
<option name="CARGO.BUILD_TASK_PROVIDER" enabled="true" />
</method>
</configuration>
</component>


@ -7,6 +7,16 @@
</scripts>
<node-interpreter value="project" />
<envs />
<method v="2" />
<method v="2">
<option name="NpmBeforeRunTask" enabled="true">
<package-json value="$PROJECT_DIR$/sdks/js/package.json" />
<command value="run" />
<scripts>
<script value="build" />
</scripts>
<node-interpreter value="project" />
<envs />
</option>
</method>
</configuration>
</component>


@ -0,0 +1,7 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="storybook" type="CompoundRunConfigurationType">
<toRun name="dev backend storybook" type="CargoCommandRunConfiguration" />
<toRun name="storybook:serve" type="js.build_tools.npm" />
<method v="2" />
</configuration>
</component>


@ -0,0 +1,12 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="storybook:serve" type="js.build_tools.npm" nameIsGenerated="true">
<package-json value="$PROJECT_DIR$/webui/package.json" />
<command value="run" />
<scripts>
<script value="storybook:serve" />
</scripts>
<node-interpreter value="project" />
<envs />
<method v="2" />
</configuration>
</component>


@ -0,0 +1,11 @@
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="test js sdk" type="JavaScriptTestRunnerJest">
<config-file value="$PROJECT_DIR$/sdks/js/jest.config.js" />
<node-interpreter value="project" />
<jest-package value="$PROJECT_DIR$/sdks/js/node_modules/jest" />
<working-dir value="$PROJECT_DIR$" />
<envs />
<scope-kind value="ALL" />
<method v="2" />
</configuration>
</component>

.woodpecker.yml

@ -1,32 +1,32 @@
pipeline:
test:
image: earthly/earthly:v0.7.15
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION]
secrets: [ EARTHLY_CONFIGURATION ]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly +test
lint:
image: earthly/earthly:v0.7.15
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION]
secrets: [ EARTHLY_CONFIGURATION ]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly +lint
# audit:
# image: earthly/earthly:v0.7.15
# image: earthly/earthly:v0.8.3
# volumes:
# - /var/run/docker.sock:/var/run/docker.sock
# environment:
@ -38,54 +38,8 @@ pipeline:
# - earthly bootstrap
# - earthly +audit
gitea:prerelease:
image: earthly/earthly:v0.7.15
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION, DOCKER_USER, DOCKER_PASSWORD]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly +current-changelog
when:
event: [tag]
appimage:
image: earthly/earthly:v0.7.15
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION, REGISTRY, REGISTRY_USER, REGISTRY_PASSWORD]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly +appimage
appimage:save:
group: push
image: woodpeckerci/plugin-s3
settings:
endpoint:
from_secret: S3_ENDPOINT
access_key:
from_secret: S3_ACCESS_KEY
secret_key:
from_secret: S3_SECRET_KEY
bucket: upend-ci-packages
path_style: true
region: anabasis
source: "dist/*.AppImage"
target: /
secrets: [S3_ENDPOINT, S3_ACCESS_KEY, S3_SECRET_KEY]
appimage:publish:nightly:
group: push
image: earthly/earthly:v0.7.15
appimage:nightly:
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
@ -98,23 +52,23 @@ pipeline:
SSH_CONFIG,
SSH_UPLOAD_KEY,
SSH_KNOWN_HOSTS,
SENTRY_AUTH_TOKEN
]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly --secret GPG_SIGN_KEY --secret SSH_CONFIG --secret SSH_UPLOAD_KEY --secret SSH_KNOWN_HOSTS +deploy-appimage-nightly
when:
branch: [main]
branch: [ main ]
docker:nightly:
group: push
image: earthly/earthly:v0.7.15
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION, DOCKER_USER, DOCKER_PASSWORD]
secrets: [ EARTHLY_CONFIGURATION, DOCKER_USER, DOCKER_PASSWORD, SENTRY_AUTH_TOKEN ]
commands:
- echo $${DOCKER_PASSWORD}| docker login --username $${DOCKER_USER} --password-stdin
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
@ -122,57 +76,86 @@ pipeline:
- earthly --push +docker-minimal
- earthly --push +docker
when:
branch: [main]
branch: [ main ]
docker:release:
group: push
image: earthly/earthly:v0.7.15
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION, DOCKER_USER, DOCKER_PASSWORD]
secrets: [ EARTHLY_CONFIGURATION, DOCKER_USER, DOCKER_PASSWORD, SENTRY_AUTH_TOKEN ]
commands:
- echo $${DOCKER_PASSWORD}| docker login --username $${DOCKER_USER} --password-stdin
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly --push +docker-minimal --tag=latest
- earthly --push +docker-minimal --tag=$CI_COMMIT_TAG
- earthly --push +docker --tag=latest
- earthly --push +docker --tag=$CI_COMMIT_TAG
- earthly --strict --push +docker-minimal --tag=latest
- earthly --strict --push +docker-minimal --tag=$CI_COMMIT_TAG
- earthly --strict --push +docker --tag=latest
- earthly --strict --push +docker --tag=$CI_COMMIT_TAG
when:
event: [tag]
event: [ tag ]
jslib:publish:
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [ EARTHLY_CONFIGURATION, NPM_TOKEN ]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly --strict --push --secret NPM_TOKEN +publish-js-all
when:
branch: [ main ]
gitea:prerelease:
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [ EARTHLY_CONFIGURATION, DOCKER_USER, DOCKER_PASSWORD ]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly -a +current-changelog/CHANGELOG_CURRENT.md CHANGELOG_CURRENT.md
- rm -rf dist
when:
event: [ tag ]
appimage:release:
image: earthly/earthly:v0.8.3
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [ EARTHLY_CONFIGURATION, REGISTRY, REGISTRY_USER, REGISTRY_PASSWORD, SENTRY_AUTH_TOKEN ]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- mkdir -p dist/
- earthly --strict -a '+appimage-signed/*' dist/
when:
event: [ tag ]
# todo: webext
gitea:release:
group: push
image: woodpeckerci/plugin-gitea-release
settings:
base_url: https://git.thm.place
files:
- "dist/*.AppImage"
- "dist/*.asc"
- webext/web-ext-artifacts/*.zip
- "dist/*"
checksum: sha512
api_key:
from_secret: woodpecker_api_key
target: main
note: CHANGELOG_CURRENT.md
when:
event: [tag]
jslib:publish:
group: push
image: earthly/earthly:v0.7.15
volumes:
- /var/run/docker.sock:/var/run/docker.sock
environment:
- FORCE_COLOR=1
- EARTHLY_EXEC_CMD="/bin/sh"
secrets: [EARTHLY_CONFIGURATION, NPM_TOKEN]
commands:
- mkdir ~/.earthly && echo "$EARTHLY_CONFIGURATION" > ~/.earthly/config.yaml
- earthly bootstrap
- earthly --push --secret NPM_TOKEN +publish-js-all
when:
branch: [main]
event: [ tag ]

CHANGELOG.md

@ -2,11 +2,66 @@
All notable changes to this project will be documented in this file.
## [0.0.76] - 2024-02-06
### Bug Fixes
- [JSLIB]: Fix types for `putBlob()`, returns a single address
### Features
- [WEBUI,JSLIB]: Upload progress
- [WEBUI]: Files can be added or removed from the upload dialog
- [WEBUI]: Select all uploaded files when done
- [WEBUI]: Start upload on Enter press
### Operations & Development
- Enable CACHE
- --force pnpm install, DRY Earthfile slightly
- Cache all rust earthly targets
- Get rid of AppImage upload to S3
- Update Earthly image version
- Remove parallelization
- [WEBUI]: Force rebundling of dependencies for `dev` script
- Intellij dev config builds jslib before webui launch
- Git ignore uploaded files in example_vault
### Styling
- [WEBUI]: Upload progress bar spacing, hide add button
### Build
- [WEBEXT]: Update shared paths with webui, fix build
- Further refactor Earthfile & build process
- Fix upend-bin target
## [0.0.75] - 2024-02-02
### Bug Fixes
- [WEBUI]: Fix upload, re-add forgotten components (Footer, AddModal, DropPasteHandler)
### Operations & Development
- Update Earthly image version
### Refactor
- [WEBUI]: Fix typo, rename ProgessBar -> ProgressBar
### Styling
- [WEBUI]: Fix uneven heights of roots
## [0.0.74] - 2024-01-28
### Bug Fixes
- [CLI]: Serve new SPA version
- [WEBUI]: Selector race conditions / wonkiness
- [CLI]: Serving web ui in Docker/AppImage
- [WEBUI]: Ordering of attributes in Selector
- [JSLIB]: Correct types for `UpObject.attr()`
@ -15,6 +70,10 @@ All notable changes to this project will be documented in this file.
- [JSLIB]: Add timeouts / aborts to all api calls
- [WEBUI]: Required & optional attributes
### Miscellaneous
- [WEBUI]: Put /dist into .eslintignore
### Operations & Development
- [WEBUI]: Fix HMR

Cargo.lock (generated)

@ -487,6 +487,18 @@ version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bddcadddf5e9015d310179a59bb28c4d4b9920ad0f11e8e14dbadf654890c9a6"
[[package]]
name = "argon2"
version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3c3610892ee6e0cbce8ae2700349fcf8f98adb0dbfbee85aec3c9179d29cc072"
dependencies = [
"base64ct",
"blake2",
"cpufeatures",
"password-hash",
]
[[package]]
name = "arrayref"
version = "0.3.7"
@ -555,6 +567,12 @@ version = "0.21.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "414dcefbc63d77c526a76b3afcf6fbb9b5e2791c19c3aa2297733208750c6e53"
[[package]]
name = "base64ct"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c3c1a368f70d6cf7302d78f8f7093da241fb8e8807c05cc9e51a125895a6d5b"
[[package]]
name = "bitflags"
version = "1.3.2"
@ -567,6 +585,15 @@ version = "2.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b4682ae6287fcf752ecaabbfcc7b6f9b72aa33933dc23a554d853aea8eea8635"
[[package]]
name = "blake2"
version = "0.10.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "46502ad458c9a52b69d4d4d32775c788b7a1b85e8bc9d482d92250fc0e3f8efe"
dependencies = [
"digest",
]
[[package]]
name = "blake2b_simd"
version = "1.0.1"
@ -863,9 +890,9 @@ dependencies = [
[[package]]
name = "cpufeatures"
version = "0.2.9"
version = "0.2.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a17b76ff3a4162b0b27f354a0c87015ddad39d35f9c0c36607a3bdd175dde1f1"
checksum = "53fe5e26ff1b7aef8bca9c6080520cfb8d9333c7568e1829cef191a9723e5504"
dependencies = [
"libc",
]
@ -1110,6 +1137,7 @@ checksum = "9ed9a281f7bc9b7576e61468ba615a66a5c8cfdff42420a70aa82701a3b1e292"
dependencies = [
"block-buffer",
"crypto-common",
"subtle",
]
[[package]]
@ -1745,9 +1773,9 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.147"
version = "0.2.153"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b4668fb0ea861c1df094127ac5f1da3409a82116a4ba74fca2e58ef927159bb3"
checksum = "9c198f91728a82281a64e1f4f9eeb25d82cb32a5de251c6bd1b5154d63a8e7bd"
[[package]]
name = "libsqlite3-sys"
@ -2254,6 +2282,17 @@ version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "944553dd59c802559559161f9816429058b869003836120e262e8caec061b7ae"
[[package]]
name = "password-hash"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "346f04948ba92c43e8469c1ee6736c7563d71012b17d40745260fe106aac2166"
dependencies = [
"base64ct",
"rand_core",
"subtle",
]
[[package]]
name = "paste"
version = "1.0.14"
@ -2914,6 +2953,12 @@ version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "73473c0e59e6d5812c5dfe2a064a6444949f089e20eec9a2e5506596494e4623"
[[package]]
name = "subtle"
version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81cdd64d312baedb58e21336b31bc043b77e01cc99033ce76ef539f78e965ebc"
[[package]]
name = "syn"
version = "1.0.109"
@ -3379,6 +3424,7 @@ name = "upend-db"
version = "0.0.2"
dependencies = [
"anyhow",
"argon2",
"chrono",
"diesel",
"diesel_migrations",
@ -3394,6 +3440,7 @@ dependencies = [
"nonempty",
"num_cpus",
"once_cell",
"password-hash",
"rayon",
"regex",
"serde",

Cargo.toml

@ -1,3 +1,3 @@
[workspace]
members = ["base", "db", "cli", "tools/upend_wasm"]
members = ["base", "db", "cli", "wasm"]
resolver = "2"

Earthfile

@ -1,4 +1,86 @@
VERSION 0.7
VERSION 0.8
# Base targets
base-rust:
FROM rust:bookworm
RUN rustup component add clippy
RUN curl -LsSf https://get.nexte.st/latest/linux | tar zxf - -C /usr/local/cargo/bin
RUN cargo install wasm-pack wasm-bindgen-cli && rustup target add wasm32-unknown-unknown
RUN cargo install cargo-audit
WORKDIR /upend
CACHE $HOME/.cargo
COPY Cargo.toml Cargo.lock .
COPY base/Cargo.toml base/Cargo.toml
COPY cli/Cargo.toml cli/Cargo.toml
COPY db/Cargo.toml db/Cargo.toml
COPY wasm/Cargo.toml wasm/Cargo.toml
RUN cargo fetch --locked
base-backend:
FROM +base-rust
COPY --dir base cli db wasm .
base-node:
FROM node:lts-iron
RUN npm install -g pnpm
WORKDIR /upend
CACHE $HOME/.local/share/pnpm
COPY +wasmlib/pkg-web wasm/pkg-web
COPY +wasmlib/pkg-node wasm/pkg-node
COPY sdks/js/package.json sdks/js/pnpm-lock.yaml sdks/js/
RUN cd sdks/js && rm -rf node_modules && pnpm install --frozen-lockfile
COPY webui/package.json webui/pnpm-lock.yaml webui/
RUN cd webui && rm -rf node_modules && pnpm install --frozen-lockfile
COPY --dir webui webext .
COPY --dir sdks/js sdks/
base-frontend:
FROM +base-node
COPY +jslib/dist sdks/js/dist
WORKDIR webui
RUN rm -rf node_modules && pnpm install --frozen-lockfile
# Intermediate targets
upend-bin:
FROM +base-backend
CACHE --id=rust-target target
COPY +git-version/version.txt .
RUN UPEND_VERSION=$(cat version.txt) cargo build --release
RUN cp target/release/upend upend.bin
SAVE ARTIFACT upend.bin upend
webui:
FROM +base-frontend
RUN pnpm build
SAVE ARTIFACT dist
wasmlib:
FROM --platform=linux/amd64 +base-rust
COPY --dir base wasm .
WORKDIR wasm
CACHE target
RUN wasm-pack build --target web --out-dir pkg-web && \
wasm-pack build --target nodejs --out-dir pkg-node
RUN sed -e 's%"name": "upend_wasm"%"name": "@upnd/wasm-web"%' -i pkg-web/package.json && \
sed -e 's%"name": "upend_wasm"%"name": "@upnd/wasm-node"%' -i pkg-node/package.json
SAVE ARTIFACT pkg-web
SAVE ARTIFACT pkg-node
jslib:
FROM +base-node
WORKDIR sdks/js
RUN pnpm build
SAVE ARTIFACT dist
webext:
FROM +base-node
WORKDIR webext
RUN pnpm build
SAVE ARTIFACT web-ext-artifacts/*.zip
# Final targets
appimage:
FROM debian:bookworm
@ -12,7 +94,16 @@ appimage:
COPY build/AppImageBuilder.yml .
RUN sed -e "s/latest/$(./AppDir/usr/bin/upend --version | cut -d ' ' -f 2)/" -i AppImageBuilder.yml
RUN pipx run appimage-builder
SAVE ARTIFACT UpEnd* AS LOCAL dist/
SAVE ARTIFACT UpEnd*
appimage-signed:
FROM alpine
RUN apk add gpg gpg-agent
RUN --secret GPG_SIGN_KEY echo "$GPG_SIGN_KEY" | gpg --import
COPY +appimage/*.AppImage .
RUN gpg --detach-sign --sign --armor *.AppImage
SAVE ARTIFACT *.AppImage
SAVE ARTIFACT *.asc
docker-minimal:
FROM debian:bookworm
@ -39,7 +130,7 @@ docker:
SAVE IMAGE --push upend/upend:$tag
DOCKER_COMMON:
COMMAND
FUNCTION
COPY +upend-bin/upend /usr/bin/upend
COPY --dir +webui/dist /usr/share/upend/webui
ENTRYPOINT ["/usr/bin/upend"]
@ -48,43 +139,7 @@ DOCKER_COMMON:
ENV UPEND_NO_DESKTOP=true
ENV UPEND_ALLOW_HOST='*'
upend-bin:
FROM +base-rust-backend
RUN cargo build --release
COPY +git-version/version.txt .
RUN UPEND_VERSION=$(cat version.txt) cargo build --release
SAVE ARTIFACT target/release/upend upend
webui:
FROM +base-node
COPY +jslib/jslib tools/upend_js
WORKDIR webui
RUN pnpm install --frozen-lockfile
RUN pnpm build
SAVE ARTIFACT dist
wasmlib:
FROM --platform=linux/amd64 +base-rust
COPY --dir base .
COPY --dir tools/upend_wasm tools/
WORKDIR tools/upend_wasm
RUN wasm-pack build --target web --out-dir pkg-web && \
wasm-pack build --target nodejs --out-dir pkg-node
RUN sed -e 's%"name": "upend_wasm"%"name": "@upnd/wasm-web"%' -i pkg-web/package.json && \
sed -e 's%"name": "upend_wasm"%"name": "@upnd/wasm-node"%' -i pkg-node/package.json
SAVE ARTIFACT pkg-web AS LOCAL tools/upend_wasm/pkg-web
SAVE ARTIFACT pkg-node AS LOCAL tools/upend_wasm/pkg-node
jslib:
FROM +base-node
WORKDIR tools/upend_js
RUN pnpm build
SAVE ARTIFACT . jslib
webext:
FROM +base-node
WORKDIR webext
RUN pnpm build
# CI targets
lint:
WAIT
@ -94,19 +149,17 @@ lint:
END
lint-backend:
FROM +base-rust-backend
FROM +base-backend
CACHE --id=rust-target target
RUN cargo clippy --workspace
lint-frontend:
FROM +base-node
COPY +jslib/jslib tools/upend_js
WORKDIR webui
RUN pnpm install --frozen-lockfile
FROM +base-frontend
RUN pnpm check && pnpm lint
lint-jslib:
FROM +base-node
WORKDIR tools/upend_js
WORKDIR sdks/js
RUN pnpm lint
audit:
@ -116,39 +169,31 @@ audit:
END
audit-backend:
FROM +base-rust-backend
FROM +base-backend
CACHE --id=rust-target target
RUN cargo audit --workspace
audit-frontend:
FROM +base-node
COPY +jslib/jslib tools/upend_js
WORKDIR webui
RUN pnpm install --frozen-lockfile
FROM +base-frontend
RUN pnpm audit
test:
WAIT
BUILD +test-jslib
BUILD +test-backend
BUILD +test-jslib
END
test-backend:
FROM +base-backend
CACHE --id=rust-target target
RUN cargo nextest run --workspace
test-jslib:
FROM +base-node
WORKDIR tools/upend_js
WORKDIR sdks/js
RUN pnpm build && pnpm test
test-backend:
FROM +base-rust-backend
RUN cargo nextest run --workspace
appimage-signed:
FROM alpine
RUN apk add gpg gpg-agent
RUN --secret GPG_SIGN_KEY echo "$GPG_SIGN_KEY" | gpg --import
COPY +appimage/*.AppImage .
RUN gpg --detach-sign --sign --armor *.AppImage
SAVE ARTIFACT *.AppImage
SAVE ARTIFACT *.asc
# Deployment targets
deploy-appimage-nightly:
FROM alpine
@ -161,90 +206,83 @@ deploy-appimage-nightly:
chmod 600 $HOME/.ssh/*
COPY +appimage-signed/* .
RUN --push scp -v *.AppImage *.asc mainsite:releases/nightly
publish-js-all:
WAIT
BUILD +publish-js-wasm
BUILD +publish-js-lib
END
publish-js-lib:
FROM +base-npm-publish
WORKDIR /upend/sdks/js
DO +NPM_PUBLISH --pkg_name=@upnd/upend
publish-js-wasm:
FROM +base-npm-publish
WORKDIR /upend/wasm/pkg-web
DO +NPM_PUBLISH --pkg_name=@upnd/wasm-web
WORKDIR /upend/wasm/pkg-node
DO +NPM_PUBLISH --pkg_name=@upnd/wasm-node
base-npm-publish:
FROM +base-node
RUN --secret NPM_TOKEN echo "//registry.npmjs.org/:_authToken=$NPM_TOKEN" > $HOME/.npmrc
COPY +jslib/dist sdks/js/dist
NPM_PUBLISH:
COMMAND
FUNCTION
ARG pkg_name
IF [ "`npm view $pkg_name version`" != "`node -p \"require('./package.json').version\"`" ]
IF --no-cache [ "`npm view $pkg_name version`" != "`node -p \"require('./package.json').version\"`" ]
RUN echo "Publishing $pkg_name to npm..."
RUN --push npm publish --access public
ELSE
RUN echo "Nothing to do for $pkg_name."
END
base-npm-publish:
FROM +base-node
RUN --secret NPM_TOKEN echo "//registry.npmjs.org/:_authToken=$NPM_TOKEN" > $HOME/.npmrc
COPY +jslib/jslib tools/upend_js
publish-js-lib:
FROM +base-npm-publish
WORKDIR /upend/tools/upend_js
DO +NPM_PUBLISH --pkg_name=@upnd/upend
publish-js-wasm:
FROM +base-npm-publish
WORKDIR /upend/tools/upend_wasm/pkg-web
DO +NPM_PUBLISH --pkg_name=@upnd/wasm-web
WORKDIR /upend/tools/upend_wasm/pkg-node
DO +NPM_PUBLISH --pkg_name=@upnd/wasm-node
publish-js-all:
BUILD +publish-js-wasm
BUILD +publish-js-lib
# Utility targets
git-version:
LOCALLY
RUN ./build/get_version.sh | tee /tmp/upend_version.txt
FROM debian:bookworm
RUN apt-get update && \
apt-get -y install git && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
COPY build/get_version.sh build/get_version.sh
COPY .git .git
RUN ./build/get_version.sh > /tmp/upend_version.txt && cat /tmp/upend_version.txt
SAVE ARTIFACT /tmp/upend_version.txt version.txt
RUN rm /tmp/upend_version.txt
changelog:
FROM orhunp/git-cliff
COPY .git .git
RUN git-cliff -o CHANGELOG.md
RUN git-cliff --bump -o CHANGELOG.md
SAVE ARTIFACT CHANGELOG.md
current-changelog:
FROM +changelog
FROM orhunp/git-cliff
COPY .git .git
RUN git-cliff --current -o CHANGELOG_CURRENT.md
SAVE ARTIFACT CHANGELOG_CURRENT.md AS LOCAL CHANGELOG_CURRENT.md
SAVE ARTIFACT CHANGELOG_CURRENT.md
update-changelog:
LOCALLY
COPY +changelog/CHANGELOG.md .
RUN git add CHANGELOG.md && git commit -m "Update CHANGELOG"
RUN git add CHANGELOG.md && git commit -m "release: Update CHANGELOG"
RUN --push git push
base-rust:
FROM rust:bookworm
RUN rustup component add clippy
RUN curl -LsSf https://get.nexte.st/latest/linux | tar zxf - -C /usr/local/cargo/bin
RUN cargo install wasm-pack wasm-bindgen-cli && rustup target add wasm32-unknown-unknown
RUN cargo install cargo-audit
WORKDIR /upend
COPY Cargo.toml Cargo.lock .
COPY base/Cargo.toml base/Cargo.toml
COPY cli/Cargo.toml cli/Cargo.toml
COPY db/Cargo.toml db/Cargo.toml
COPY tools/upend_wasm/Cargo.toml tools/upend_wasm/Cargo.toml
RUN cargo fetch --locked
base-rust-backend:
FROM +base-rust
COPY --dir base cli db .
COPY --dir tools/upend_wasm tools/
base-node:
FROM node:lts
RUN npm install -g pnpm
WORKDIR /upend
COPY +wasmlib/pkg-web tools/upend_wasm/pkg-web
COPY +wasmlib/pkg-node tools/upend_wasm/pkg-node
COPY tools/upend_js/package.json tools/upend_js/pnpm-lock.yaml tools/upend_js/
RUN cd tools/upend_js && pnpm install --frozen-lockfile
COPY webui/package.json webui/pnpm-lock.yaml webui/
RUN cd webui && pnpm install --frozen-lockfile
COPY --dir webui webext .
COPY --dir tools/upend_js tools/
dev-local:
FROM debian:bookworm
COPY +jslib/dist /js-dist
COPY +wasmlib/pkg-web /wasm-web
COPY +wasmlib/pkg-node /wasm-node
SAVE ARTIFACT /js-dist AS LOCAL sdks/js/dist
SAVE ARTIFACT /wasm-web AS LOCAL wasm/pkg-web
SAVE ARTIFACT /wasm-node AS LOCAL wasm/pkg-node
dev-update-sdk:
LOCALLY
WORKDIR sdks/js
RUN pnpm build
WORKDIR webui
RUN pnpm install
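
All of the targets above can be run locally with the same commands the CI pipeline uses; for example (assuming Earthly >= 0.8 and a running Docker daemon):

earthly +lint        # clippy plus webui/jslib checks, as in the lint pipeline step
earthly +test        # cargo nextest plus jslib tests, as in the test pipeline step
earthly +dev-local   # save jslib dist and wasm pkg-web/pkg-node into the working tree

Since this changeset avoids LOCAL side effects by default (see "build: further refactor Earthfile & build process"), +dev-local and +dev-update-sdk are now the targets meant to write artifacts back into the checkout.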

Taskfile.yml

@ -1,176 +0,0 @@
# https://taskfile.dev
version: "3"
tasks:
default:
cmds:
- task: lint
- task: test
- task: build
lint:
deps: [lint:frontend, lint:backend, lint:jslib, lint:webext]
lint:frontend:
dir: webui
deps: [setup:frontend]
sources:
- ./**/*.ts
- ./**/*.svelte
cmds:
- pnpm check
- pnpm lint
lint:backend:
sources:
- ./**/Cargo.toml
- ./**/*.rs
cmds:
- cargo clippy --workspace
lint:jslib:
deps: [build:jslib]
dir: tools/upend_js
sources:
- ./*.ts
- ./package.lock
cmds:
- pnpm lint
lint:webext:
dir: webext
deps: [setup:webext]
sources:
- ./**/*.ts
- ./**/*.svelte
cmds:
- pnpm lint
test:
deps: [test:backend]
test:backend:
sources:
- ./**/Cargo.toml
- ./**/*.rs
cmds:
- cargo nextest run --workspace
build:
deps: [build:frontend, build:backend, build:webext]
build:backend:
dir: cli
sources:
- ./**/Cargo.toml
- ./**/*.rs
cmds:
- cargo build --release
build:frontend:
dir: webui
deps: [setup:frontend]
sources:
- ./**/*.ts
- ./**/*.svelte
cmds:
- pnpm build
build:jslib:
deps: [build:wasmlib]
dir: tools/upend_js
cmds:
- pnpm install --frozen-lockfile
- pnpm build
build:wasmlib:
dir: tools/upend_wasm
cmds:
- wasm-pack build --target web
setup:frontend:
deps: [build:jslib]
dir: webui
sources:
- ../tools/upend_js/*.js
- package.lock
cmds:
- pnpm install --frozen-lockfile
build:webext:
deps: [setup:webext]
dir: webext
sources:
- ./**/*.ts
- ./**/*.svelte
cmds:
- pnpm build
build:webext:package-sources:
deps: [build:webext]
cmds:
- mkdir -p webext/web-ext-artifacts/tmp/upend/
- git ls-files -z tools/upend_js | xargs -0 cp --parents -t webext/web-ext-artifacts/tmp/upend
- git ls-files -z webext | xargs -0 cp --parents -t webext/web-ext-artifacts/tmp/upend
- cd webext/web-ext-artifacts/tmp/ && zip -vr ../upend-webext-sources.zip upend
setup:webext:
deps: [build:jslib]
dir: webext
sources:
- ../tools/upend_js/*.js
- package.lock
cmds:
- pnpm install --frozen-lockfile
dev:update_tool_deps:
cmds:
- task: build:jslib
- rm -vrf webui/node_modules/.vite/deps
dev:
deps: [dev:backend, dev:frontend]
dev:backend:
dir: cli
cmds:
- cargo run --release -- serve ../example_vault --clean --no-browser --reinitialize
dev:frontend:
deps: [build:jslib]
dir: webui
cmds:
- pnpm dev --open
clean:
deps:
[clean:backend, clean:frontend, clean:webext, clean:tools, clean:vault]
clean:backend:
cmds:
- cargo clean
clean:frontend:
cmds:
- rm -rf webui/node_modules
- rm -vrf webui/dist webui/public/vendor
clean:webext:
cmds:
- rm -rf webext/node_modules
- rm -vrf webext/web-ext-artifacts webext/dist
clean:tools:
cmds:
- rm -vrf tools/upend_js/*.js
clean:vault:
cmds:
- rm -vrf example_vault/.upend
update-schema:
cmds:
- rm -f upend.sqlite3
- diesel migration run --migration-dir migrations/upend/
- diesel print-schema > src/database/inner/schema.rs

base/src/addressing.rs

@ -1,3 +1,4 @@
use crate::entry::Attribute;
use crate::error::{AddressComponentsDecodeError, UpEndError};
use crate::hash::{
b58_decode, b58_encode, AsMultihash, AsMultihashError, LargeMultihash, UpMultihash, IDENTITY,
@ -18,7 +19,7 @@ use wasm_bindgen::prelude::*;
pub enum Address {
Hash(UpMultihash),
Uuid(Uuid),
Attribute(String),
Attribute(Attribute),
Url(Url),
}
@ -62,7 +63,7 @@ impl Address {
),
Self::Attribute(attribute) => (
UP_ATTRIBUTE,
LargeMultihash::wrap(IDENTITY, attribute.as_bytes())
LargeMultihash::wrap(IDENTITY, attribute.to_string().as_bytes())
.map_err(UpEndError::from_any)?,
),
Self::Url(url) => (
@ -102,9 +103,14 @@ impl Address {
UP_UUID => Ok(Address::Uuid(
Uuid::from_slice(digest.as_slice()).map_err(UpEndError::from_any)?,
)),
UP_ATTRIBUTE => Ok(Address::Attribute(
String::from_utf8(digest).map_err(UpEndError::from_any)?,
)),
UP_ATTRIBUTE => {
let attribute = String::from_utf8(digest).map_err(UpEndError::from_any)?;
if attribute.is_empty() {
Ok(Address::Attribute(Attribute::null()))
} else {
Ok(Address::Attribute(attribute.parse()?))
}
}
UP_URL => Ok(Address::Url(
Url::parse(&String::from_utf8(digest).map_err(UpEndError::from_any)?)
.map_err(UpEndError::from_any)?,
@ -120,7 +126,7 @@ impl Address {
let (entity_type, entity_content) = match self {
Address::Hash(uphash) => ("Hash", Some(b58_encode(uphash.to_bytes()))),
Address::Uuid(uuid) => ("Uuid", Some(uuid.to_string())),
Address::Attribute(attribute) => ("Attribute", Some(attribute.clone())),
Address::Attribute(attribute) => ("Attribute", Some(attribute.to_string())),
Address::Url(url) => ("Url", Some(url.to_string())),
};
@ -133,11 +139,12 @@ impl Address {
pub fn from_components(components: AddressComponents) -> Result<Self, UpEndError> {
// TODO: make this automatically derive from `Address` definition
let address = match components {
AddressComponents { t, c } if t == "Attribute" => {
Address::Attribute(c.ok_or(UpEndError::AddressComponentsDecodeError(
AddressComponents { t, c } if t == "Attribute" => Address::Attribute(
c.ok_or(UpEndError::AddressComponentsDecodeError(
AddressComponentsDecodeError::MissingValue,
))?)
}
))?
.parse()?,
),
AddressComponents { t, c } if t == "Url" => Address::Url(if let Some(string) = c {
Url::parse(&string).map_err(|e| {
UpEndError::AddressComponentsDecodeError(
@ -250,6 +257,9 @@ mod tests {
use uuid::Uuid;
use crate::addressing::{Address, IDENTITY};
use crate::constants::{
TYPE_ATTRIBUTE_ADDRESS, TYPE_HASH_ADDRESS, TYPE_URL_ADDRESS, TYPE_UUID_ADDRESS,
};
use crate::hash::{LargeMultihash, UpMultihash};
use super::UpEndError;
@ -262,6 +272,11 @@ mod tests {
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, decoded);
let addr = &*TYPE_HASH_ADDRESS;
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, &decoded);
Ok(())
}
@ -271,15 +286,25 @@ mod tests {
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, decoded);
let addr = &*TYPE_UUID_ADDRESS;
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, &decoded);
Ok(())
}
#[test]
fn test_attribute_codec() -> Result<(), UpEndError> {
let addr = Address::Attribute(String::from("ATTRIBUTE"));
let addr = Address::Attribute("ATTRIBUTE".parse().unwrap());
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, decoded);
let addr = &*TYPE_ATTRIBUTE_ADDRESS;
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, &decoded);
Ok(())
}
@ -289,6 +314,11 @@ mod tests {
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, decoded);
let addr = &*TYPE_URL_ADDRESS;
let encoded = addr.encode()?;
let decoded = Address::decode(&encoded)?;
assert_eq!(addr, &decoded);
Ok(())
}
}
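
Downstream of this change, attribute addresses are validated uniformly: from_components now routes the raw string through the Attribute parser instead of accepting any String. A minimal sketch of the round trip, assuming the AddressComponents fields are the public t/c pair the pattern match above suggests:

use upend_base::addressing::{Address, AddressComponents};
use upend_base::error::UpEndError;

fn demo() -> Result<(), UpEndError> {
    // Attribute::from_str upper-cases, so "label" and "LABEL" yield the
    // same attribute address.
    let addr = Address::from_components(AddressComponents {
        t: "Attribute".to_string(),
        c: Some("label".to_string()),
    })?;
    assert_eq!(addr, Address::Attribute("LABEL".parse()?));
    Ok(())
}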

base/src/constants.rs

@ -1,4 +1,5 @@
use crate::addressing::Address;
use crate::entry::Attribute;
use crate::entry::InvariantEntry;
use crate::hash::{LargeMultihash, UpMultihash};
@ -19,13 +20,13 @@ pub const ATTR_KEY: &str = "KEY";
lazy_static! {
pub static ref HIER_ROOT_INVARIANT: InvariantEntry = InvariantEntry {
attribute: String::from(ATTR_KEY),
attribute: ATTR_KEY.parse().unwrap(),
value: "HIER_ROOT".into(),
};
pub static ref HIER_ROOT_ADDR: Address = HIER_ROOT_INVARIANT.entity().unwrap();
pub static ref TYPE_HASH_ADDRESS: Address =
Address::Hash(UpMultihash::from(LargeMultihash::default()));
pub static ref TYPE_UUID_ADDRESS: Address = Address::Uuid(uuid::Uuid::nil());
pub static ref TYPE_ATTRIBUTE_ADDRESS: Address = Address::Attribute("".to_string());
pub static ref TYPE_ATTRIBUTE_ADDRESS: Address = Address::Attribute(Attribute::null());
pub static ref TYPE_URL_ADDRESS: Address = Address::Url(url::Url::parse("up:").unwrap());
}

base/src/entry.rs

@ -1,19 +1,55 @@
use crate::addressing::Address;
use crate::error::UpEndError;
use crate::hash::{b58_decode, sha256hash, AsMultihash, AsMultihashError, UpMultihash};
use crate::lang::Attribute;
use chrono::NaiveDateTime;
use serde::{Deserialize, Serialize};
use std::convert::TryFrom;
use std::io::{Cursor, Write};
use std::str::FromStr;
use url::Url;
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize, Hash)]
pub struct Attribute(String);
impl Attribute {
pub fn null() -> Self {
Self("".to_string())
}
}
impl std::fmt::Display for Attribute {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.0)
}
}
impl FromStr for Attribute {
type Err = UpEndError;
fn from_str(value: &str) -> Result<Self, Self::Err> {
if value.is_empty() {
Err(UpEndError::EmptyAttribute)
} else {
Ok(Self(value.to_uppercase()))
}
}
}
impl<S> PartialEq<S> for Attribute
where
S: AsRef<str>,
{
fn eq(&self, other: &S) -> bool {
self.0.eq_ignore_ascii_case(other.as_ref())
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Entry {
pub entity: Address,
pub attribute: String,
pub attribute: Attribute,
pub value: EntryValue,
pub provenance: String,
pub user: Option<String>,
pub timestamp: NaiveDateTime,
}
@ -22,10 +58,11 @@ pub struct ImmutableEntry(pub Entry);
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct InvariantEntry {
pub attribute: String,
pub attribute: Attribute,
pub value: EntryValue,
}
#[allow(clippy::large_enum_variant)]
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(tag = "t", content = "c")]
pub enum EntryValue {
@ -36,18 +73,6 @@ pub enum EntryValue {
Invalid,
}
impl Default for Entry {
fn default() -> Self {
Self {
entity: Address::Uuid(uuid::Uuid::nil()),
attribute: Default::default(),
value: EntryValue::Null,
provenance: "SYSTEM".into(),
timestamp: NaiveDateTime::from_timestamp_opt(0, 0).unwrap(),
}
}
}
impl TryFrom<&InvariantEntry> for Entry {
type Error = UpEndError;
@ -57,7 +82,8 @@ impl TryFrom<&InvariantEntry> for Entry {
attribute: invariant.attribute.clone(),
value: invariant.value.clone(),
provenance: "INVARIANT".to_string(),
..Default::default()
user: None,
timestamp: NaiveDateTime::from_timestamp_opt(0, 0).unwrap(),
})
}
}
@ -66,7 +92,7 @@ impl InvariantEntry {
pub fn entity(&self) -> Result<Address, UpEndError> {
let mut entity = Cursor::new(vec![0u8; 0]);
entity
.write_all(self.attribute.as_bytes())
.write_all(self.attribute.0.as_bytes())
.map_err(UpEndError::from_any)?;
entity
.write_all(self.value.to_string()?.as_bytes())
@ -92,7 +118,7 @@ impl AsMultihash for Entry {
.map_err(|e| AsMultihashError(e.to_string()))?
.as_slice(),
)?;
result.write_all(self.attribute.as_bytes())?;
result.write_all(self.attribute.0.as_bytes())?;
result.write_all(
self.value
.to_string()
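
The net effect of the String-to-Attribute migration is that attribute names are validated and case-normalized at construction time rather than by convention. A minimal sketch of the new semantics, based only on the FromStr, Display and PartialEq impls above:

use upend_base::entry::Attribute;

fn demo() {
    // Parsing is the canonicalizing constructor: it upper-cases...
    let attr: Attribute = "file_mime".parse().unwrap();
    assert_eq!(attr.to_string(), "FILE_MIME");

    // ...comparisons against str-likes are case-insensitive...
    assert!(attr == "File_Mime");

    // ...and empty attributes are rejected by the parser, existing only as
    // the explicit Attribute::null() used by TYPE_ATTRIBUTE_ADDRESS.
    assert!("".parse::<Attribute>().is_err());
}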

base/src/error.rs

@ -3,6 +3,7 @@ pub enum UpEndError {
HashDecodeError(String),
AddressParseError(String),
AddressComponentsDecodeError(AddressComponentsDecodeError),
EmptyAttribute,
CannotSerializeInvalid,
QueryParseError(String),
Other(String),
@ -35,6 +36,7 @@ impl std::fmt::Display for UpEndError {
String::from("Invalid EntryValues cannot be serialized."),
UpEndError::QueryParseError(err) => format!("Error parsing query: {err}"),
UpEndError::Other(err) => format!("Unknown error: {err}"),
UpEndError::EmptyAttribute => String::from("Attribute cannot be empty."),
}
)
}

base/src/lang.rs

@ -1,4 +1,5 @@
use crate::addressing::Address;
use crate::entry::Attribute;
use crate::entry::EntryValue;
use crate::error::UpEndError;
use nonempty::NonEmpty;
@ -6,15 +7,6 @@ use std::borrow::Borrow;
use std::convert::TryFrom;
use std::str::FromStr;
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct Attribute(pub String);
impl From<&str> for Attribute {
fn from(str: &str) -> Self {
Self(str.to_string())
}
}
#[derive(Debug, Clone, PartialEq)]
pub enum QueryComponent<T>
where
@ -79,7 +71,7 @@ impl TryFrom<lexpr::Value> for Attribute {
fn try_from(value: lexpr::Value) -> Result<Self, Self::Error> {
match value {
lexpr::Value::String(str) => Ok(Attribute(str.to_string())),
lexpr::Value::String(str) => str.parse(),
_ => Err(UpEndError::QueryParseError(
"Can only convert to attribute from string.".into(),
)),
@ -87,6 +79,7 @@ impl TryFrom<lexpr::Value> for Attribute {
}
}
#[allow(clippy::large_enum_variant)]
#[derive(Debug, Clone, PartialEq)]
pub enum QueryPart {
Matches(PatternQuery),
@ -331,7 +324,7 @@ mod test {
query,
Query::SingleQuery(QueryPart::Matches(PatternQuery {
entity: QueryComponent::Variable(None),
attribute: QueryComponent::Exact("FOO".into()),
attribute: QueryComponent::Exact("FOO".parse().unwrap()),
value: QueryComponent::Variable(None)
}))
);
@ -372,7 +365,7 @@ mod test {
query,
Query::SingleQuery(QueryPart::Matches(PatternQuery {
entity: QueryComponent::Variable(None),
attribute: QueryComponent::In(vec!("FOO".into(), "BAR".into())),
attribute: QueryComponent::In(vec!("FOO".parse().unwrap(), "BAR".parse().unwrap())),
value: QueryComponent::Variable(None)
}))
);

build/get_version.sh

@ -1,5 +1,9 @@
#!/bin/sh
which git > /dev/null || {
echo "git not found"
exit 1
}
git_tag=$(git describe --tags --exact-match HEAD 2>/dev/null)
if [ -z "$git_tag" ]; then


@ -4,5 +4,4 @@ pub struct UpEndConfig {
pub desktop_enabled: bool,
pub trust_executables: bool,
pub secret: String,
pub key: Option<String>,
}


@ -13,19 +13,21 @@ use upend_db::stores::Blob;
use upend_db::{
jobs::{JobContainer, JobState},
stores::{fs::FILE_MIME_KEY, UpStore},
BlobMode, UpEndConnection,
BlobMode, OperationContext, UpEndConnection,
};
lazy_static! {
pub static ref ID3_TYPE_INVARIANT: InvariantEntry = InvariantEntry {
attribute: String::from(ATTR_KEY),
attribute: ATTR_KEY.parse().unwrap(),
value: "TYPE_ID3".into(),
};
pub static ref ID3_TYPE_LABEL: Entry = Entry {
entity: ID3_TYPE_INVARIANT.entity().unwrap(),
attribute: ATTR_LABEL.into(),
attribute: ATTR_LABEL.parse().unwrap(),
value: "ID3".into(),
..Default::default()
provenance: "INVARIANT".to_string(),
user: None,
timestamp: chrono::Utc::now().naive_utc(),
};
}
@ -38,6 +40,7 @@ impl Extractor for ID3Extractor {
connection: &UpEndConnection,
store: Arc<Box<dyn UpStore + Send + Sync>>,
mut job_container: JobContainer,
context: OperationContext,
) -> Result<Vec<Entry>> {
if let Address::Hash(hash) = address {
let files = store.retrieve(hash)?;
@ -59,31 +62,33 @@ impl Extractor for ID3Extractor {
let tags = id3::Tag::read_from_path(file_path)?;
let mut result: Vec<Entry> = tags
.frames()
.flat_map(|frame| match frame.content() {
id3::Content::Text(text) => vec![
let mut result: Vec<Entry> = vec![];
for frame in tags.frames() {
if let id3::Content::Text(text) = frame.content() {
result.extend(vec![
Entry {
entity: address.clone(),
attribute: format!("ID3_{}", frame.id()),
attribute: format!("ID3_{}", frame.id()).parse()?,
value: match frame.id() {
"TYER" | "TBPM" => EntryValue::guess_from(text),
_ => text.clone().into(),
},
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
Entry {
entity: Address::Attribute(format!("ID3_{}", frame.id())),
attribute: ATTR_LABEL.into(),
entity: Address::Attribute(format!("ID3_{}", frame.id()).parse()?),
attribute: ATTR_LABEL.parse().unwrap(),
value: format!("ID3: {}", frame.name()).into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
],
_ => vec![],
})
.collect();
]);
}
}
let mut has_pictures = false;
for (idx, picture) in tags.pictures().enumerate() {
@ -96,22 +101,25 @@ impl Extractor for ID3Extractor {
Blob::from_filepath(&tmp_path),
None,
Some(BlobMode::StoreOnly),
context.clone(),
)?;
result.push(Entry {
entity: address.clone(),
attribute: "ID3_PICTURE".to_string(),
attribute: "ID3_PICTURE".parse()?,
value: EntryValue::Address(Address::Hash(hash)),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
});
has_pictures = true;
}
if has_pictures {
result.push(Entry {
entity: Address::Attribute("ID3_PICTURE".to_string()),
attribute: ATTR_LABEL.into(),
entity: Address::Attribute("ID3_PICTURE".parse()?),
attribute: ATTR_LABEL.parse().unwrap(),
value: "ID3 Embedded Image".into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
})
}
@ -123,9 +131,11 @@ impl Extractor for ID3Extractor {
.filter(|e| e.attribute != ATTR_LABEL)
.map(|e| Entry {
entity: Address::Attribute(e.attribute.clone()),
attribute: ATTR_OF.into(),
attribute: ATTR_OF.parse().unwrap(),
value: EntryValue::Address(ID3_TYPE_INVARIANT.entity().unwrap()),
..Default::default()
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
})
.collect::<Vec<Entry>>(),
);
@ -134,9 +144,11 @@ impl Extractor for ID3Extractor {
ID3_TYPE_LABEL.clone(),
Entry {
entity: address.clone(),
attribute: ATTR_IN.into(),
attribute: ATTR_IN.parse().unwrap(),
value: EntryValue::Address(ID3_TYPE_INVARIANT.entity().unwrap()),
..Default::default()
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
]);
}
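
Each extractor now receives an OperationContext and stamps its provenance and user onto every entry it emits (context.provenance.clone() + "EXTRACTOR", context.user.clone()). The struct is imported from upend_db and its definition is not part of this compare view; a hypothetical sketch of its shape, inferred only from how it is used above:

// Hypothetical shape — the real definition in upend_db is not shown here.
#[derive(Clone)]
pub struct OperationContext {
    pub provenance: String,   // e.g. "SYSTEM ", concatenated with "EXTRACTOR"
    pub user: Option<String>, // copied into Entry::user
}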


@ -3,6 +3,7 @@ use std::sync::Arc;
use super::Extractor;
use anyhow::{anyhow, Result};
use lazy_static::lazy_static;
use upend_base::entry::Attribute;
use upend_base::{
addressing::Address,
constants::{ATTR_IN, ATTR_KEY, ATTR_LABEL, ATTR_OF},
@ -11,7 +12,7 @@ use upend_base::{
use upend_db::{
jobs::{JobContainer, JobState},
stores::{fs::FILE_MIME_KEY, UpStore},
UpEndConnection,
OperationContext, UpEndConnection,
};
pub struct ExifExtractor;
@ -21,14 +22,16 @@ pub struct ExifExtractor;
lazy_static! {
pub static ref EXIF_TYPE_INVARIANT: InvariantEntry = InvariantEntry {
attribute: String::from(ATTR_KEY),
attribute: ATTR_KEY.parse().unwrap(),
value: "TYPE_EXIF".into(),
};
pub static ref EXIF_TYPE_LABEL: Entry = Entry {
entity: EXIF_TYPE_INVARIANT.entity().unwrap(),
attribute: ATTR_LABEL.into(),
attribute: ATTR_LABEL.parse().unwrap(),
value: "EXIF".into(),
..Default::default()
provenance: "INVARIANT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: None
};
}
@ -39,6 +42,7 @@ impl Extractor for ExifExtractor {
_connection: &UpEndConnection,
store: Arc<Box<dyn UpStore + Send + Sync>>,
mut job_container: JobContainer,
context: OperationContext,
) -> Result<Vec<Entry>> {
if let Address::Hash(hash) = address {
let files = store.retrieve(hash)?;
@ -63,42 +67,42 @@ impl Extractor for ExifExtractor {
let exifreader = exif::Reader::new();
let exif = exifreader.read_from_container(&mut bufreader)?;
let mut result: Vec<Entry> = exif
let mut result: Vec<Entry> = vec![];
for field in exif
.fields()
.filter(|field| !matches!(field.value, exif::Value::Undefined(..)))
.flat_map(|field| {
if let Some(tag_description) = field.tag.description() {
let attribute = format!("EXIF_{}", field.tag.1);
{
if let Some(tag_description) = field.tag.description() {
let attribute: Attribute = format!("EXIF_{}", field.tag.1).parse()?;
vec![
Entry {
entity: address.clone(),
attribute: attribute.clone(),
value: match field.tag {
exif::Tag::ExifVersion => {
EntryValue::String(format!("{}", field.display_value()))
}
_ => EntryValue::guess_from(format!(
"{}",
field.display_value()
)),
},
provenance: "SYSTEM EXTRACTOR".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
result.extend(vec![
Entry {
entity: address.clone(),
attribute: attribute.clone(),
value: match field.tag {
exif::Tag::ExifVersion => {
EntryValue::String(format!("{}", field.display_value()))
}
_ => {
EntryValue::guess_from(format!("{}", field.display_value()))
}
},
Entry {
entity: Address::Attribute(attribute),
attribute: ATTR_LABEL.into(),
value: format!("EXIF: {}", tag_description).into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
},
]
} else {
vec![]
}
})
.collect();
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
Entry {
entity: Address::Attribute(attribute),
attribute: ATTR_LABEL.parse().unwrap(),
value: format!("EXIF: {}", tag_description).into(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
]);
}
}
if !result.is_empty() {
result.extend(
@ -107,9 +111,11 @@ impl Extractor for ExifExtractor {
.filter(|e| e.attribute != ATTR_LABEL)
.map(|e| Entry {
entity: Address::Attribute(e.attribute.clone()),
attribute: ATTR_OF.into(),
attribute: ATTR_OF.parse().unwrap(),
value: EntryValue::Address(EXIF_TYPE_INVARIANT.entity().unwrap()),
..Default::default()
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
})
.collect::<Vec<Entry>>(),
);
@ -120,9 +126,11 @@ impl Extractor for ExifExtractor {
EXIF_TYPE_LABEL.clone(),
Entry {
entity: address.clone(),
attribute: ATTR_IN.into(),
attribute: ATTR_IN.parse().unwrap(),
value: EntryValue::Address(EXIF_TYPE_INVARIANT.entity().unwrap()),
..Default::default()
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
]);
}
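The EXIF extractor switches from an iterator chain to an explicit `for` loop because attribute names are now parsed fallibly: a `?` inside the old `flat_map` closure would return from the closure (which yields a `Vec<Entry>`, not a `Result`), never from `get` itself. A generic sketch of the pattern, with illustrative names:

```rust
// `?` propagates to the enclosing function only in a plain loop body;
// inside a closure it would have to match the closure's return type.
fn collect_parsed(inputs: &[&str]) -> Result<Vec<i32>, std::num::ParseIntError> {
    let mut out = Vec::new();
    for s in inputs {
        let n: i32 = s.parse()?; // bails out of collect_parsed on bad input
        out.push(n);
    }
    Ok(out)
}
```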

View File

@ -3,6 +3,7 @@ use std::{process::Command, sync::Arc};
use super::Extractor;
use anyhow::{anyhow, Result};
use lazy_static::lazy_static;
use tracing::{debug, trace};
use upend_base::{
addressing::Address,
constants::{ATTR_IN, ATTR_KEY, ATTR_LABEL, ATTR_OF},
@ -11,27 +12,31 @@ use upend_base::{
use upend_db::{
jobs::{JobContainer, JobState},
stores::{fs::FILE_MIME_KEY, UpStore},
UpEndConnection,
OperationContext, UpEndConnection,
};
const DURATION_KEY: &str = "MEDIA_DURATION";
lazy_static! {
pub static ref MEDIA_TYPE_INVARIANT: InvariantEntry = InvariantEntry {
attribute: String::from(ATTR_KEY),
attribute: ATTR_KEY.parse().unwrap(),
value: "TYPE_MEDIA".into(),
};
pub static ref MEDIA_TYPE_LABEL: Entry = Entry {
entity: MEDIA_TYPE_INVARIANT.entity().unwrap(),
attribute: ATTR_LABEL.into(),
attribute: ATTR_LABEL.parse().unwrap(),
value: "Multimedia".into(),
..Default::default()
provenance: "INVARIANT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: None,
};
pub static ref DURATION_OF_MEDIA: Entry = Entry {
entity: Address::Attribute(DURATION_KEY.to_string()),
attribute: ATTR_OF.into(),
entity: Address::Attribute(DURATION_KEY.parse().unwrap()),
attribute: ATTR_OF.parse().unwrap(),
value: EntryValue::Address(MEDIA_TYPE_INVARIANT.entity().unwrap()),
..Default::default()
provenance: "INVARIANT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: None,
};
}
@ -44,6 +49,7 @@ impl Extractor for MediaExtractor {
_connection: &UpEndConnection,
store: Arc<Box<dyn UpStore + Send + Sync>>,
mut job_container: JobContainer,
context: OperationContext,
) -> Result<Vec<Entry>> {
if let Address::Hash(hash) = address {
let files = store.retrieve(hash)?;
@ -90,9 +96,10 @@ impl Extractor for MediaExtractor {
let result = vec![
Entry {
entity: address.clone(),
attribute: DURATION_KEY.to_string(),
attribute: DURATION_KEY.parse().unwrap(),
value: EntryValue::Number(duration),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
(&MEDIA_TYPE_INVARIANT as &InvariantEntry)
@ -102,9 +109,11 @@ impl Extractor for MediaExtractor {
DURATION_OF_MEDIA.clone(),
Entry {
entity: address.clone(),
attribute: ATTR_IN.into(),
attribute: ATTR_IN.parse().unwrap(),
value: EntryValue::Address(MEDIA_TYPE_INVARIANT.entity().unwrap()),
..Default::default()
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
];
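`MEDIA_TYPE_INVARIANT.entity()` is used here without any database lookup, which works because an `InvariantEntry` appears to derive its entity address purely from its own `(attribute, value)` content — the same invariant always resolves to the same address. A sketch under that assumption:

```rust
// Assumes InvariantEntry::entity() is content-derived and stable across runs;
// the .unwrap() mirrors the usage in the diff above.
fn media_type_address() -> Address {
    let media_type = InvariantEntry {
        attribute: ATTR_KEY.parse().unwrap(),
        value: "TYPE_MEDIA".into(),
    };
    media_type.entity().unwrap() // referenced by DURATION_OF_MEDIA and the ATTR_IN entry
}
```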

View File

@ -6,7 +6,9 @@ use std::{
};
use tracing::{debug, info, trace};
use upend_base::{addressing::Address, entry::Entry};
use upend_db::{jobs::JobContainer, stores::UpStore, UpEndConnection, UpEndDatabase};
use upend_db::{
jobs::JobContainer, stores::UpStore, OperationContext, UpEndConnection, UpEndDatabase,
};
#[cfg(feature = "extractors-web")]
pub mod web;
@ -27,6 +29,7 @@ pub trait Extractor {
connection: &UpEndConnection,
store: Arc<Box<dyn UpStore + Send + Sync>>,
job_container: JobContainer,
context: OperationContext,
) -> Result<Vec<Entry>>;
fn is_needed(&self, _address: &Address, _connection: &UpEndConnection) -> Result<bool> {
@ -39,9 +42,10 @@ pub trait Extractor {
connection: &UpEndConnection,
store: Arc<Box<dyn UpStore + Send + Sync>>,
job_container: JobContainer,
context: OperationContext,
) -> Result<usize> {
if self.is_needed(address, connection)? {
let entries = self.get(address, connection, store, job_container)?;
let entries = self.get(address, connection, store, job_container, context)?;
trace!("For \"{address}\", got: {entries:?}");
connection.transaction(|| {
@ -62,6 +66,7 @@ pub fn extract_all<D: Borrow<UpEndDatabase>>(
db: D,
store: Arc<Box<dyn UpStore + Send + Sync>>,
mut job_container: JobContainer,
context: OperationContext,
) -> Result<usize> {
info!("Extracting metadata for all addresses.");
@ -77,7 +82,13 @@ pub fn extract_all<D: Borrow<UpEndDatabase>>(
.par_iter()
.map(|address| {
let connection = db.connection()?;
let entry_count = extract(address, &connection, store.clone(), job_container.clone());
let entry_count = extract(
address,
&connection,
store.clone(),
job_container.clone(),
context.clone(),
);
let mut cnt = count.write().unwrap();
*cnt += 1;
@ -107,6 +118,7 @@ pub fn extract(
connection: &UpEndConnection,
store: Arc<Box<dyn UpStore + Send + Sync>>,
job_container: JobContainer,
context: OperationContext,
) -> usize {
let mut entry_count = 0;
trace!("Extracting metadata for {address:?}");
@ -118,6 +130,7 @@ pub fn extract(
connection,
store.clone(),
job_container.clone(),
context.clone(),
);
match extract_result {
@ -133,6 +146,7 @@ pub fn extract(
connection,
store.clone(),
job_container.clone(),
context.clone(),
);
match extract_result {
@ -148,6 +162,7 @@ pub fn extract(
connection,
store.clone(),
job_container.clone(),
context.clone(),
);
match extract_result {
@ -158,8 +173,13 @@ pub fn extract(
#[cfg(feature = "extractors-media")]
{
let extract_result =
media::MediaExtractor.insert_info(address, connection, store.clone(), job_container);
let extract_result = media::MediaExtractor.insert_info(
address,
connection,
store.clone(),
job_container,
context.clone(),
);
match extract_result {
Ok(count) => entry_count += count,
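Every `Extractor` implementation now receives the `OperationContext` as its final parameter. A minimal conforming implementation might look like this (the leading `&self`/`address` parameters are assumed from the call sites above; the rest mirrors the trait as diffed):

```rust
pub struct NoopExtractor;

impl Extractor for NoopExtractor {
    fn get(
        &self,
        _address: &Address,
        _connection: &UpEndConnection,
        _store: Arc<Box<dyn UpStore + Send + Sync>>,
        _job_container: JobContainer,
        _context: OperationContext,
    ) -> Result<Vec<Entry>> {
        // A real extractor would stamp each produced Entry with
        // _context.user and a provenance based on _context.provenance.
        Ok(vec![])
    }
}
```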

View File

@ -14,7 +14,7 @@ use upend_base::entry::EntryValue;
use upend_db::jobs::JobContainer;
use upend_db::jobs::JobState;
use upend_db::stores::UpStore;
use upend_db::UpEndConnection;
use upend_db::{OperationContext, UpEndConnection};
use webpage::HTML;
pub struct WebExtractor;
@ -26,6 +26,7 @@ impl Extractor for WebExtractor {
_connection: &UpEndConnection,
_store: Arc<Box<dyn UpStore + Send + Sync>>,
mut job_container: JobContainer,
context: OperationContext,
) -> Result<Vec<Entry>> {
if let Address::Url(url) = address {
let mut job_handle =
@ -40,23 +41,26 @@ impl Extractor for WebExtractor {
let mut entries = vec![
html.title.as_ref().map(|html_title| Entry {
entity: address.clone(),
attribute: "HTML_TITLE".to_string(),
attribute: "HTML_TITLE".parse().unwrap(),
value: html_title.clone().into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}),
html.title.map(|html_title| Entry {
entity: address.clone(),
attribute: ATTR_LABEL.to_string(),
attribute: ATTR_LABEL.parse().unwrap(),
value: html_title.into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}),
html.description.map(|html_desc| Entry {
entity: address.clone(),
attribute: "HTML_DESCRIPTION".to_string(),
attribute: "HTML_DESCRIPTION".parse().unwrap(),
value: html_desc.into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}),
];
@ -65,27 +69,30 @@ impl Extractor for WebExtractor {
if attribute == "OG_TITLE" {
entries.push(Some(Entry {
entity: address.clone(),
attribute: ATTR_LABEL.to_string(),
attribute: ATTR_LABEL.parse()?,
value: value.clone().into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}));
}
entries.push(Some(Entry {
entity: address.clone(),
attribute,
attribute: attribute.parse()?,
value: value.into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}));
}
for image in html.opengraph.images {
entries.push(Some(Entry {
entity: address.clone(),
attribute: "OG_IMAGE".to_string(),
attribute: "OG_IMAGE".parse()?,
value: image.url.into(),
provenance: "SYSTEM EXTRACTOR".to_string(),
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}))
}
@ -99,10 +106,11 @@ impl Extractor for WebExtractor {
vec![
Entry {
entity: Address::Attribute(e.attribute.clone()),
attribute: ATTR_OF.to_string(),
attribute: ATTR_OF.parse().unwrap(),
value: EntryValue::Address(TYPE_URL_ADDRESS.clone()),
provenance: "SYSTEM EXTRACTOR".to_string(),
..Default::default()
provenance: context.provenance.clone() + "EXTRACTOR",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
},
e,
]
@ -149,7 +157,13 @@ mod test {
let address = Address::Url(Url::parse("https://upend.dev").unwrap());
assert!(WebExtractor.is_needed(&address, &connection)?);
WebExtractor.insert_info(&address, &connection, store, job_container)?;
WebExtractor.insert_info(
&address,
&connection,
store,
job_container,
OperationContext::default(),
)?;
assert!(!WebExtractor.is_needed(&address, &connection)?);

View File

@ -16,7 +16,7 @@ use std::collections::HashMap;
use std::net::SocketAddr;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::{Arc, Mutex};
use tracing::trace;
use tracing::{debug, error, info, warn};
use tracing_subscriber::filter::{EnvFilter, LevelFilter};
@ -26,7 +26,7 @@ use upend_base::hash::{sha256hash, UpMultihash};
use upend_db::jobs::JobContainer;
use upend_db::stores::fs::FsStore;
use upend_db::stores::UpStore;
use upend_db::{BlobMode, UpEndDatabase};
use upend_db::{BlobMode, OperationContext, UpEndDatabase};
use crate::util::exec::block_background;
@ -80,7 +80,7 @@ enum Commands {
entity: String,
/// The attribute of the entry.
attribute: String,
/// The value; its type will be heurestically determined.
/// The value; its type will be heuristically determined.
value: String,
/// Output format
#[arg(short, long, default_value = "tsv")]
@ -172,10 +172,6 @@ struct ServeArgs {
#[arg(long, env = "UPEND_SECRET")]
secret: Option<String>,
/// Authentication key users must supply.
#[arg(long, env = "UPEND_KEY")]
key: Option<String>,
/// Allowed host/domain name the API can serve.
#[arg(long, env = "UPEND_ALLOW_HOST")]
allow_host: Vec<String>,
@ -349,7 +345,7 @@ async fn main() -> Result<()> {
if !exists {
warn!(
"Couldn't locate Web UI directory ({:?}), disabling...",
WEBUI_PATH.to_owned()
*WEBUI_PATH
);
}
exists
@ -415,9 +411,9 @@ async fn main() -> Result<()> {
})),
desktop_enabled: !args.no_desktop,
trust_executables: args.trust_executables,
key: args.key,
secret,
},
public: Arc::new(Mutex::new(upend.connection()?.get_users()?.is_empty())),
};
// Start HTTP server
@ -451,13 +447,18 @@ async fn main() -> Result<()> {
block_background::<_, _, anyhow::Error>(move || {
let connection: upend_db::UpEndConnection = upend.connection()?;
let tree_mode = args.rescan_mode.unwrap_or_else(|| {
let tree_mode = if let Some(rescan_mode) = args.rescan_mode {
connection.set_vault_options(upend_db::VaultOptions {
blob_mode: Some(rescan_mode.clone()),
})?;
rescan_mode
} else {
connection
.get_vault_options()
.unwrap()
.blob_mode
.unwrap_or_default()
});
};
let _ = state.store.update(
&upend,
@ -466,8 +467,14 @@ async fn main() -> Result<()> {
initial: false,
tree_mode,
},
OperationContext::default(),
);
let _ = extractors::extract_all(
upend,
state.store,
job_container,
OperationContext::default(),
);
let _ = extractors::extract_all(upend, state.store, job_container);
Ok(())
});
}

View File

@ -1,10 +1,10 @@
use anyhow::anyhow;
use anyhow::Result;
use std::collections::HashMap;
use std::io::Read;
use std::path::Path;
use std::process::Command;
use anyhow::Result;
use tracing::{debug, trace};
use super::Previewable;

View File

@ -1,10 +1,10 @@
use anyhow::anyhow;
use anyhow::Result;
use std::collections::HashMap;
use std::io::Read;
use std::path::Path;
use std::process::Command;
use anyhow::Result;
use tracing::{debug, trace};
use super::Previewable;

View File

@ -26,7 +26,7 @@ use serde_json::json;
use std::collections::HashMap;
use std::convert::{TryFrom, TryInto};
use std::io::Write;
use std::sync::Arc;
use std::sync::{Arc, Mutex};
use std::time::{SystemTime, UNIX_EPOCH};
use tempfile::NamedTempFile;
use tracing::{debug, info, trace};
@ -41,12 +41,14 @@ use upend_db::jobs;
use upend_db::stores::UpdateOptions;
use upend_db::stores::{Blob, UpStore};
use upend_db::BlobMode;
use upend_db::OperationContext;
use upend_db::UpEndDatabase;
use upend_db::VaultOptions;
use url::Url;
#[cfg(feature = "desktop")]
use is_executable::IsExecutable;
use upend_base::error::UpEndError;
#[derive(Clone)]
pub struct State {
@ -56,69 +58,146 @@ pub struct State {
pub job_container: jobs::JobContainer,
pub preview_store: Option<Arc<PreviewStore>>,
pub preview_thread_pool: Option<Arc<rayon::ThreadPool>>,
pub public: Arc<Mutex<bool>>,
}
#[derive(Debug, Serialize, Deserialize)]
struct JwtClaims {
user: String,
exp: usize,
}
#[derive(Deserialize)]
pub struct LoginRequest {
key: String,
pub struct UserPayload {
username: String,
password: String,
}
#[derive(Deserialize)]
pub struct LoginQueryParams {
via: Option<String>,
}
#[post("/api/auth/login")]
pub async fn login(
state: web::Data<State>,
payload: web::Json<LoginRequest>,
payload: web::Json<UserPayload>,
query: web::Query<LoginQueryParams>,
) -> Result<HttpResponse, Error> {
if state.config.key.is_none() || Some(&payload.key) == state.config.key.as_ref() {
let claims = JwtClaims {
exp: (SystemTime::now()
.duration_since(UNIX_EPOCH)
.map_err(ErrorInternalServerError)?
.as_secs()
+ 7 * 24 * 60 * 60) as usize,
};
let conn = state.upend.connection().map_err(ErrorInternalServerError)?;
let token = jsonwebtoken::encode(
&jsonwebtoken::Header::default(),
&claims,
&jsonwebtoken::EncodingKey::from_secret(state.config.secret.as_ref()),
)
.map_err(ErrorInternalServerError)?;
Ok(HttpResponse::Ok().json(json!({ "token": token })))
} else {
Err(ErrorUnauthorized("Incorrect token."))
match conn.authenticate_user(&payload.username, &payload.password) {
Ok(()) => {
let token = create_token(&payload.username, &state.config.secret)?;
match query.via.as_deref() {
Some("cookie") => Ok(HttpResponse::NoContent()
.append_header((http::header::SET_COOKIE, format!("key={}; Path=/", token)))
.finish()),
_ => Ok(HttpResponse::Ok().json(json!({ "key": token }))),
}
}
Err(_) => Err(ErrorUnauthorized("Invalid credentials.")),
}
}
fn check_auth(req: &HttpRequest, state: &State) -> Result<(), actix_web::Error> {
if let Some(key) = &state.config.key {
if let Some(auth_header) = req.headers().get("Authorization") {
let auth_header = auth_header.to_str().map_err(|err| {
ErrorBadRequest(format!("Invalid value in Authorization header: {err:?}"))
})?;
#[post("/api/auth/logout")]
pub async fn logout() -> Result<HttpResponse, Error> {
Ok(HttpResponse::NoContent()
.append_header((http::header::SET_COOKIE, "key=; Path=/; Max-Age=0"))
.finish())
}
let token = jsonwebtoken::decode::<JwtClaims>(
auth_header,
&jsonwebtoken::DecodingKey::from_secret(key.as_ref()),
&jsonwebtoken::Validation::default(),
);
#[post("/api/auth/register")]
pub async fn register(
req: HttpRequest,
state: web::Data<State>,
payload: web::Json<UserPayload>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
token
.map(|_| ())
.map_err(|err| ErrorUnauthorized(format!("Invalid token: {err:?}")))
} else {
Err(ErrorUnauthorized("Authorization required."))
let conn = state.upend.connection().map_err(ErrorInternalServerError)?;
match conn.set_user(&payload.username, &payload.password) {
Ok(_) => {
*state.public.lock().unwrap() = false;
let token = create_token(&payload.username, &state.config.secret)?;
Ok(HttpResponse::Ok().json(json!({ "token": token })))
}
Err(e) => Err(ErrorInternalServerError(e)),
}
}
#[get("/api/auth/whoami")]
pub async fn whoami(req: HttpRequest, state: web::Data<State>) -> Result<HttpResponse, Error> {
let user = check_auth(&req, &state)?;
Ok(HttpResponse::Ok().json(json!({ "user": user })))
}
fn check_auth(req: &HttpRequest, state: &State) -> Result<Option<String>, actix_web::Error> {
if *state.public.lock().unwrap() {
return Ok(None);
}
let header_key = req.headers().get("Authorization").and_then(|value| {
value.to_str().ok().and_then(|value| {
if value.starts_with("Bearer ") {
Some(value.trim_start_matches("Bearer ").to_string())
} else {
None
}
})
});
let cookie_key = req.cookies().ok().and_then(|cookies| {
cookies
.iter()
.find(|c| c.name() == "key")
.map(|cookie| cookie.value().to_string())
});
let query_key = req.query_string().split('&').find_map(|pair| {
let parts = pair.split('=').collect::<Vec<&str>>();
match parts[..] {
["auth_key", value] => Some(value.to_string()),
_ => None,
}
});
let key = header_key.or(cookie_key).or(query_key);
if let Some(key) = key {
let token = jsonwebtoken::decode::<JwtClaims>(
&key,
&jsonwebtoken::DecodingKey::from_secret(state.config.secret.as_ref()),
&jsonwebtoken::Validation::default(),
);
match token {
Ok(token) => Ok(Some(token.claims.user)),
Err(err) => Err(ErrorUnauthorized(format!("Invalid token: {err:?}"))),
}
} else {
Ok(())
Err(ErrorUnauthorized("Authorization required."))
}
}
fn create_token(username: &str, secret: &str) -> Result<String, Error> {
let claims = JwtClaims {
user: username.to_string(),
exp: (SystemTime::now()
.duration_since(UNIX_EPOCH)
.map_err(ErrorInternalServerError)?
.as_secs()
+ 7 * 24 * 60 * 60) as usize,
};
jsonwebtoken::encode(
&jsonwebtoken::Header::default(),
&claims,
&jsonwebtoken::EncodingKey::from_secret(secret.as_ref()),
)
.map_err(ErrorInternalServerError)
}
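`check_auth` accepts the key from three places — the `Authorization: Bearer` header, the `key` cookie, or an `auth_key` query parameter, in that order of precedence — and validates it as a JWT against the instance secret; `create_token` issues tokens with a seven-day expiry. A round-trip sketch with the `jsonwebtoken` crate (the username and expiry values are illustrative):

```rust
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct JwtClaims {
    user: String,
    exp: usize,
}

fn roundtrip(secret: &str) -> Result<String, jsonwebtoken::errors::Error> {
    let claims = JwtClaims {
        user: "alice".to_string(), // hypothetical user
        exp: 2_000_000_000,        // some future UNIX timestamp
    };
    let token = jsonwebtoken::encode(
        &jsonwebtoken::Header::default(), // HS256 by default
        &claims,
        &jsonwebtoken::EncodingKey::from_secret(secret.as_ref()),
    )?;
    let decoded = jsonwebtoken::decode::<JwtClaims>(
        &token,
        &jsonwebtoken::DecodingKey::from_secret(secret.as_ref()),
        &jsonwebtoken::Validation::default(), // also checks `exp`
    )?;
    Ok(decoded.claims.user)
}
```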
#[derive(Deserialize)]
pub struct RawRequest {
native: Option<String>,
@ -127,10 +206,13 @@ pub struct RawRequest {
#[get("/api/raw/{hash}")]
pub async fn get_raw(
req: HttpRequest,
state: web::Data<State>,
web::Query(query): web::Query<RawRequest>,
hash: web::Path<String>,
) -> Result<impl Responder, Error> {
check_auth(&req, &state)?;
let address =
Address::decode(&b58_decode(hash.into_inner()).map_err(ErrorInternalServerError)?)
.map_err(ErrorInternalServerError)?;
@ -217,9 +299,12 @@ pub async fn get_raw(
#[head("/api/raw/{hash}")]
pub async fn head_raw(
req: HttpRequest,
state: web::Data<State>,
hash: web::Path<String>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let address =
Address::decode(&b58_decode(hash.into_inner()).map_err(ErrorInternalServerError)?)
.map_err(ErrorInternalServerError)?;
@ -253,10 +338,13 @@ pub async fn head_raw(
#[get("/api/thumb/{hash}")]
pub async fn get_thumbnail(
req: HttpRequest,
state: web::Data<State>,
hash: web::Path<String>,
web::Query(query): web::Query<HashMap<String, String>>,
) -> Result<Either<NamedFile, HttpResponse>, Error> {
check_auth(&req, &state)?;
#[cfg(feature = "previews")]
if let Some(preview_store) = &state.preview_store {
let hash = hash.into_inner();
@ -298,7 +386,13 @@ pub async fn get_thumbnail(
}
#[post("/api/query")]
pub async fn get_query(state: web::Data<State>, query: String) -> Result<HttpResponse, Error> {
pub async fn get_query(
req: HttpRequest,
state: web::Data<State>,
query: String,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let in_query: Query = query.parse().map_err(ErrorBadRequest)?;
@ -340,9 +434,12 @@ impl EntriesAsHash for Vec<Entry> {
#[get("/api/obj/{address_str}")]
pub async fn get_object(
req: HttpRequest,
state: web::Data<State>,
address: web::Path<Address>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let address = address.into_inner();
@ -384,6 +481,7 @@ pub struct InEntry {
pub value: EntryValue,
}
#[allow(clippy::large_enum_variant)]
#[derive(Debug, Clone, Deserialize)]
#[serde(untagged, deny_unknown_fields)]
pub enum PutInput {
@ -404,7 +502,7 @@ pub async fn put_object(
payload: web::Json<PutInput>,
web::Query(query): web::Query<UpdateQuery>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let user = check_auth(&req, &state)?;
let (entry_address, entity_address) = {
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
@ -413,11 +511,12 @@ pub async fn put_object(
debug!("PUTting {in_entry:?}");
let provenance = query.provenance.clone();
let _user = user.clone();
let process_inentry = move |in_entry: InEntry| -> Result<Entry> {
if let Some(entity) = in_entry.entity {
Ok(Entry {
entity: entity.try_into()?,
attribute: in_entry.attribute,
attribute: in_entry.attribute.parse()?,
value: in_entry.value,
provenance: (match &provenance {
Some(s) => format!("API {}", s),
@ -426,10 +525,11 @@ pub async fn put_object(
.trim()
.to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: _user.clone(),
})
} else {
Ok(Entry::try_from(&InvariantEntry {
attribute: in_entry.attribute,
attribute: in_entry.attribute.parse()?,
value: in_entry.value,
})?)
}
@ -467,21 +567,31 @@ pub async fn put_object(
let _address = address.clone();
let _job_container = state.job_container.clone();
let _store = state.store.clone();
let _user = user.clone();
block_background::<_, _, anyhow::Error>(move || {
let entry_count =
extractors::extract(&_address, &connection, _store, _job_container);
let entry_count = extractors::extract(
&_address,
&connection,
_store,
_job_container,
OperationContext {
user: _user,
provenance: "API".to_string(),
},
);
debug!("Added {entry_count} extracted entries for {_address:?}");
Ok(())
});
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let _user = user.clone();
web::block(move || {
connection.transaction::<_, anyhow::Error, _>(|| {
if connection.retrieve_object(&address)?.is_empty() {
connection.insert_entry(Entry {
entity: address.clone(),
attribute: ATTR_ADDED.to_string(),
attribute: ATTR_ADDED.parse().unwrap(),
value: EntryValue::Number(
SystemTime::now()
.duration_since(UNIX_EPOCH)
@ -494,6 +604,7 @@ pub async fn put_object(
})
.trim()
.to_string(),
user: _user,
timestamp: chrono::Utc::now().naive_utc(),
})?;
}
@ -516,7 +627,7 @@ pub async fn put_blob(
state: web::Data<State>,
mut payload: Multipart,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let user = check_auth(&req, &state)?;
if let Some(mut field) = payload.try_next().await? {
let mut file = NamedTempFile::new()?;
@ -555,6 +666,7 @@ pub async fn put_blob(
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let _store = state.store.clone();
let _filename = filename.clone();
let _user = user.clone();
let hash = web::block(move || {
let options = connection.get_vault_options()?;
_store
@ -563,6 +675,10 @@ pub async fn put_blob(
Blob::from_filepath(file.path()),
_filename,
options.blob_mode,
OperationContext {
user: _user,
provenance: "API".to_string(),
},
)
.map_err(anyhow::Error::from)
})
@ -588,8 +704,18 @@ pub async fn put_blob(
let _job_container = state.job_container.clone();
let _store = state.store.clone();
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let _user = user.clone();
block_background::<_, _, anyhow::Error>(move || {
let entry_count = extractors::extract(&_address, &connection, _store, _job_container);
let entry_count = extractors::extract(
&_address,
&connection,
_store,
_job_container,
OperationContext {
user: _user,
provenance: "API".to_string(),
},
);
debug!("Added {entry_count} extracted entries for {_address:?}");
Ok(())
});
@ -607,7 +733,7 @@ pub async fn put_object_attribute(
value: web::Json<EntryValue>,
web::Query(query): web::Query<UpdateQuery>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let user = check_auth(&req, &state)?;
let (address, attribute) = path.into_inner();
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
@ -622,7 +748,7 @@ pub async fn put_object_attribute(
let new_attr_entry = Entry {
entity: address,
attribute,
attribute: attribute.parse()?,
value: value.into_inner(),
provenance: (match &query.provenance {
Some(s) => format!("API {}", s),
@ -630,6 +756,7 @@ pub async fn put_object_attribute(
})
.trim()
.to_string(),
user: user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
};
@ -688,7 +815,14 @@ pub async fn get_address(
web::Query(query): web::Query<HashMap<String, String>>,
) -> Result<HttpResponse, Error> {
let (address, immutable) = if let Some(attribute) = query.get("attribute") {
(Address::Attribute(attribute.into()), true)
(
Address::Attribute(
attribute
.parse()
.map_err(|e: UpEndError| ErrorBadRequest(e.to_string()))?,
),
true,
)
} else if let Some(url) = query.get("url") {
(
Address::Url(Url::parse(url).map_err(ErrorBadRequest)?),
@ -727,7 +861,12 @@ pub async fn get_address(
}
#[get("/api/all/attributes")]
pub async fn get_all_attributes(state: web::Data<State>) -> Result<HttpResponse, Error> {
pub async fn get_all_attributes(
req: HttpRequest,
state: web::Data<State>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let attributes = web::block(move || connection.get_all_attributes())
.await?
@ -770,6 +909,8 @@ pub async fn list_hier(
path: web::Path<String>,
req: HttpRequest,
) -> Result<HttpResponse, Error> {
let user = check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
if path.is_empty() {
Ok(HttpResponse::MovedPermanently()
@ -780,9 +921,19 @@ pub async fn list_hier(
trace!(r#"Listing path "{}""#, upath);
let create = !req.method().is_safe();
let path = web::block(move || resolve_path(&connection, &upath, create))
.await?
.map_err(ErrorNotFound)?;
let path = web::block(move || {
resolve_path(
&connection,
&upath,
create,
OperationContext {
user,
provenance: "API".to_string(),
},
)
})
.await?
.map_err(ErrorNotFound)?;
match path.last() {
Some(addr) => Ok(HttpResponse::Found()
.append_header((http::header::LOCATION, format!("../../api/obj/{}", addr)))
@ -793,7 +944,11 @@ pub async fn list_hier(
}
#[get("/api/hier_roots")]
pub async fn list_hier_roots(state: web::Data<State>) -> Result<HttpResponse, Error> {
pub async fn list_hier_roots(
req: HttpRequest,
state: web::Data<State>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let result = web::block(move || {
@ -821,7 +976,7 @@ pub async fn api_refresh(
state: web::Data<State>,
web::Query(query): web::Query<RescanRequest>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let user = check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
@ -838,11 +993,19 @@ pub async fn api_refresh(
.unwrap_or_default(),
),
},
OperationContext {
user: user.clone(),
provenance: "API".to_string(),
},
);
let _ = crate::extractors::extract_all(
state.upend.clone(),
state.store.clone(),
state.job_container.clone(),
OperationContext {
user: user.clone(),
provenance: "API".to_string(),
},
);
Ok(())
});
@ -850,13 +1013,15 @@ pub async fn api_refresh(
}
#[get("/api/stats/vault")]
pub async fn vault_stats(state: web::Data<State>) -> Result<HttpResponse, Error> {
pub async fn vault_stats(req: HttpRequest, state: web::Data<State>) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
Ok(HttpResponse::Ok().json(connection.get_stats().map_err(ErrorInternalServerError)?))
}
#[get("/api/stats/store")]
pub async fn store_stats(state: web::Data<State>) -> Result<HttpResponse, Error> {
pub async fn store_stats(req: HttpRequest, state: web::Data<State>) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
Ok(HttpResponse::Ok().json(json!({
"main": state.store.stats().map_err(ErrorInternalServerError)?
})))
@ -869,9 +1034,11 @@ pub struct JobsRequest {
#[get("/api/jobs")]
pub async fn get_jobs(
req: HttpRequest,
state: web::Data<State>,
web::Query(query): web::Query<JobsRequest>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let jobs = state
.job_container
.get_jobs()
@ -898,12 +1065,14 @@ pub async fn get_info(state: web::Data<State>) -> Result<HttpResponse, Error> {
upend_db::common::build::PKG_VERSION,
build::PKG_VERSION
),
"desktop": state.config.desktop_enabled
"desktop": state.config.desktop_enabled,
"public": *state.public.lock().unwrap(),
})))
}
#[get("/api/options")]
pub async fn get_options(state: web::Data<State>) -> Result<HttpResponse, Error> {
pub async fn get_options(req: HttpRequest, state: web::Data<State>) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
Ok(HttpResponse::Ok().json(
connection
@ -931,7 +1100,11 @@ pub async fn put_options(
}
#[get("/api/migration/user-entries")]
pub async fn get_user_entries(state: web::Data<State>) -> Result<HttpResponse, Error> {
pub async fn get_user_entries(
req: HttpRequest,
state: web::Data<State>,
) -> Result<HttpResponse, Error> {
check_auth(&req, &state)?;
let connection = state.upend.connection().map_err(ErrorInternalServerError)?;
let result = web::block(move || connection.get_explicit_entries())
@ -1145,7 +1318,7 @@ mod tests {
assert_eq!(result["entity"]["t"], "Hash");
assert_eq!(result["entity"]["c"], digest_str);
let address = Address::Attribute("TEST".to_string());
let address = Address::Attribute("TEST".parse().unwrap());
let req = actix_web::test::TestRequest::get()
.uri(&format!("/api/obj/{}", address))
.to_request();
@ -1206,6 +1379,7 @@ mod tests {
initial: true,
tree_mode: upend_db::BlobMode::default(),
},
OperationContext::default(),
)
.unwrap();
@ -1217,11 +1391,11 @@ mod tests {
desktop_enabled: false,
trust_executables: false,
secret: "secret".to_string(),
key: None,
},
job_container,
preview_store: None,
preview_thread_pool: None,
public: Arc::new(Mutex::new(true)),
}
}
}

View File

@ -46,6 +46,9 @@ where
.app_data(actix_web::web::Data::new(state))
.wrap(actix_web::middleware::Logger::default().exclude("/api/jobs"))
.service(routes::login)
.service(routes::register)
.service(routes::logout)
.service(routes::whoami)
.service(routes::get_raw)
.service(routes::head_raw)
.service(routes::get_thumbnail)

View File

@ -26,13 +26,16 @@ once_cell = "1.7.2"
lru = "0.7.0"
diesel = { version = "1.4", features = [
"sqlite",
"r2d2",
"chrono",
"serde_json",
"sqlite",
"r2d2",
"chrono",
"serde_json",
] }
diesel_migrations = "1.4"
libsqlite3-sys = { version = "^0", features = ["bundled"] }
password-hash = "0.5.0"
argon2 = "0.5.3"
chrono = { version = "0.4", features = ["serde"] }
serde = { version = "1.0", features = ["derive"] }
@ -42,10 +45,10 @@ regex = "1"
multibase = "0.9"
multihash = { version = "*", default-features = false, features = [
"alloc",
"multihash-impl",
"sha2",
"identity",
"alloc",
"multihash-impl",
"sha2",
"identity",
] }
uuid = { version = "1.4", features = ["v4"] }
url = { version = "2", features = ["serde"] }

View File

@ -0,0 +1 @@
DROP TABLE users;

View File

@ -0,0 +1,7 @@
CREATE TABLE users
(
id INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
username VARCHAR NOT NULL,
password VARCHAR NOT NULL,
UNIQUE (username)
);

View File

@ -0,0 +1,2 @@
ALTER TABLE data
DROP COLUMN user;

View File

@ -0,0 +1,2 @@
ALTER TABLE data
ADD COLUMN user VARCHAR;

View File

@ -18,7 +18,8 @@ use diesel::{BoxableExpression, QueryDsl};
use diesel::{ExpressionMethods, TextExpressionMethods};
use upend_base::addressing::Address;
use upend_base::entry::{EntryPart, EntryValue};
use upend_base::lang::{Attribute, Query, QueryComponent, QueryPart, QueryQualifier};
use upend_base::error::UpEndError;
use upend_base::lang::{Query, QueryComponent, QueryPart, QueryQualifier};
#[derive(Debug, Clone)]
pub struct QueryExecutionError(String);
@ -31,6 +32,12 @@ impl std::fmt::Display for QueryExecutionError {
impl std::error::Error for QueryExecutionError {}
impl From<UpEndError> for QueryExecutionError {
fn from(e: UpEndError) -> Self {
QueryExecutionError(e.to_string())
}
}
pub fn execute(
connection: &PooledConnection<ConnectionManager<SqliteConnection>>,
query: Query,
@ -126,7 +133,7 @@ pub fn execute(
}
EntryPart::Attribute(a) => {
Some(EntryValue::Address(Address::Attribute(
a.0.clone(),
a.clone(),
)))
}
EntryPart::Value(v) => Some(v.clone()),
@ -171,10 +178,8 @@ pub fn execute(
var_name.clone(),
subquery_results
.iter()
.map(|e| {
EntryPart::Attribute(Attribute(e.attribute.clone()))
})
.collect(),
.map(|e| e.attribute.parse().map(EntryPart::Attribute))
.collect::<Result<Vec<EntryPart>, _>>()?,
);
}
@ -266,10 +271,10 @@ fn to_sqlite_predicates(query: Query) -> Result<SqlResult, QueryExecutionError>
match &eq.attribute {
QueryComponent::Exact(q_attribute) => {
subqueries.push(Box::new(data::attribute.eq(q_attribute.0.clone())))
subqueries.push(Box::new(data::attribute.eq(q_attribute.to_string())))
}
QueryComponent::In(q_attributes) => subqueries.push(Box::new(
data::attribute.eq_any(q_attributes.iter().map(|a| &a.0).cloned()),
data::attribute.eq_any(q_attributes.iter().map(|a| a.to_string())),
)),
QueryComponent::Contains(q_attribute) => subqueries
.push(Box::new(data::attribute.like(format!("%{}%", q_attribute)))),

View File

@ -1,35 +1,38 @@
use crate::inner::models;
use anyhow::{anyhow, Result};
use std::convert::TryFrom;
use upend_base::addressing::{Address, Addressable};
use upend_base::entry::{Entry, EntryValue, ImmutableEntry};
use upend_base::error::UpEndError;
impl TryFrom<&models::Entry> for Entry {
type Error = anyhow::Error;
type Error = UpEndError;
fn try_from(e: &models::Entry) -> Result<Self, Self::Error> {
if let Some(value_str) = &e.value_str {
Ok(Entry {
entity: Address::decode(&e.entity)?,
attribute: e.attribute.clone(),
value: value_str.parse()?,
attribute: e.attribute.parse()?,
value: value_str.parse().unwrap(),
provenance: e.provenance.clone(),
user: e.user.clone(),
timestamp: e.timestamp,
})
} else if let Some(value_num) = e.value_num {
Ok(Entry {
entity: Address::decode(&e.entity)?,
attribute: e.attribute.clone(),
attribute: e.attribute.parse()?,
value: EntryValue::Number(value_num),
provenance: e.provenance.clone(),
user: e.user.clone(),
timestamp: e.timestamp,
})
} else {
Ok(Entry {
entity: Address::decode(&e.entity)?,
attribute: e.attribute.clone(),
attribute: e.attribute.parse()?,
value: EntryValue::Number(f64::NAN),
provenance: e.provenance.clone(),
user: e.user.clone(),
timestamp: e.timestamp,
})
}
@ -40,22 +43,20 @@ impl TryFrom<&Entry> for models::Entry {
type Error = anyhow::Error;
fn try_from(e: &Entry) -> Result<Self, Self::Error> {
if e.attribute.is_empty() {
return Err(anyhow!("Attribute cannot be empty."));
}
let base_entry = models::Entry {
identity: e.address()?.encode()?,
entity_searchable: match &e.entity {
Address::Attribute(attr) => Some(attr.clone()),
Address::Attribute(attr) => Some(attr.to_string()),
Address::Url(url) => Some(url.to_string()),
_ => None,
},
entity: e.entity.encode()?,
attribute: e.attribute.clone(),
attribute: e.attribute.to_string(),
value_str: None,
value_num: None,
immutable: false,
provenance: e.provenance.clone(),
user: e.user.clone(),
timestamp: e.timestamp,
};

View File

@ -6,6 +6,7 @@ use lru::LruCache;
use tracing::trace;
use uuid::Uuid;
use crate::OperationContext;
use upend_base::addressing::Address;
use upend_base::constants::ATTR_LABEL;
use upend_base::constants::{ATTR_IN, HIER_ROOT_ADDR, HIER_ROOT_INVARIANT};
@ -74,7 +75,7 @@ pub fn list_roots(connection: &UpEndConnection) -> Result<Vec<Address>> {
Ok(connection
.query(Query::SingleQuery(QueryPart::Matches(PatternQuery {
entity: QueryComponent::Variable(None),
attribute: QueryComponent::Exact(ATTR_IN.into()),
attribute: QueryComponent::Exact(ATTR_IN.parse().unwrap()),
value: QueryComponent::Exact((*HIER_ROOT_ADDR).clone().into()),
})))?
.into_iter()
@ -91,6 +92,7 @@ pub fn fetch_or_create_dir(
parent: Option<Address>,
directory: UNode,
create: bool,
context: OperationContext,
) -> Result<Address> {
match parent.clone() {
Some(address) => trace!("FETCHING/CREATING {}/{:#}", address, directory),
@ -105,7 +107,7 @@ pub fn fetch_or_create_dir(
let matching_directories = connection
.query(Query::SingleQuery(QueryPart::Matches(PatternQuery {
entity: QueryComponent::Variable(None),
attribute: QueryComponent::Exact(ATTR_LABEL.into()),
attribute: QueryComponent::Exact(ATTR_LABEL.parse().unwrap()),
value: QueryComponent::Exact(directory.to_string().into()),
})))?
.into_iter()
@ -115,7 +117,7 @@ pub fn fetch_or_create_dir(
Some(parent) => connection
.query(Query::SingleQuery(QueryPart::Matches(PatternQuery {
entity: QueryComponent::Variable(None),
attribute: QueryComponent::Exact(ATTR_IN.into()),
attribute: QueryComponent::Exact(ATTR_IN.parse().unwrap()),
value: QueryComponent::Exact(parent.into()),
})))?
.into_iter()
@ -135,9 +137,10 @@ pub fn fetch_or_create_dir(
let directory_entry = Entry {
entity: new_directory_address.clone(),
attribute: String::from(ATTR_LABEL),
attribute: ATTR_LABEL.parse().unwrap(),
value: directory.to_string().into(),
provenance: "SYSTEM FS".to_string(),
provenance: context.provenance.clone() + "HIER",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
};
connection.insert_entry(directory_entry)?;
@ -145,17 +148,19 @@ pub fn fetch_or_create_dir(
connection.insert_entry(if let Some(parent) = parent {
Entry {
entity: new_directory_address.clone(),
attribute: String::from(ATTR_IN),
attribute: ATTR_IN.parse().unwrap(),
value: parent.into(),
provenance: "SYSTEM FS".to_string(),
provenance: context.provenance.clone() + "HIER",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}
} else {
Entry {
entity: new_directory_address.clone(),
attribute: String::from(ATTR_IN),
attribute: ATTR_IN.parse().unwrap(),
value: HIER_ROOT_ADDR.clone().into(),
provenance: "SYSTEM FS".to_string(),
provenance: context.provenance.clone() + "HIER",
user: context.user.clone(),
timestamp: chrono::Utc::now().naive_utc(),
}
})?;
@ -177,6 +182,7 @@ pub fn resolve_path(
connection: &UpEndConnection,
path: &UHierPath,
create: bool,
context: OperationContext,
) -> Result<Vec<Address>> {
let mut result: Vec<Address> = vec![];
let mut path_stack = path.0.to_vec();
@ -188,6 +194,7 @@ pub fn resolve_path(
result.last().cloned(),
path_stack.pop().unwrap(),
create,
context.clone(),
)?;
result.push(dir_address);
}
@ -201,6 +208,7 @@ pub fn resolve_path_cached(
connection: &UpEndConnection,
path: &UHierPath,
create: bool,
context: OperationContext,
cache: &Arc<Mutex<ResolveCache>>,
) -> Result<Vec<Address>> {
let mut result: Vec<Address> = vec![];
@ -216,7 +224,7 @@ pub fn resolve_path_cached(
result.push(address.clone());
} else {
drop(cache_lock);
let address = fetch_or_create_dir(connection, parent, node, create)?;
let address = fetch_or_create_dir(connection, parent, node, create, context.clone())?;
result.push(address.clone());
cache.lock().unwrap().put(key, address);
}
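`resolve_path_cached` wraps the same resolution in an LRU cache keyed by `(parent, node)`, releasing the lock before the potentially slow create so parallel rescan workers aren't serialized on it. The shape of the pattern, with illustrative types standing in for the project's:

```rust
use lru::LruCache; // lru 0.7, where LruCache::new takes a plain usize
use std::sync::{Arc, Mutex};

type Key = (Option<String>, String); // (parent address, node name) stand-ins

fn lookup_or_create(
    cache: &Arc<Mutex<LruCache<Key, String>>>,
    key: Key,
    create: impl FnOnce() -> String,
) -> String {
    let mut lock = cache.lock().unwrap();
    if let Some(addr) = lock.get(&key) {
        return addr.clone();
    }
    drop(lock); // don't hold the lock through the slow path
    let addr = create();
    cache.lock().unwrap().put(key, addr.clone());
    addr
}
```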
@ -286,11 +294,23 @@ mod tests {
let open_result = UpEndDatabase::open(&temp_dir, true).unwrap();
let connection = open_result.db.connection().unwrap();
let foo_result = fetch_or_create_dir(&connection, None, UNode("foo".to_string()), true);
let foo_result = fetch_or_create_dir(
&connection,
None,
UNode("foo".to_string()),
true,
OperationContext::default(),
);
assert!(foo_result.is_ok());
let foo_result = foo_result.unwrap();
let bar_result = fetch_or_create_dir(&connection, None, UNode("bar".to_string()), true);
let bar_result = fetch_or_create_dir(
&connection,
None,
UNode("bar".to_string()),
true,
OperationContext::default(),
);
assert!(bar_result.is_ok());
let bar_result = bar_result.unwrap();
@ -299,6 +319,7 @@ mod tests {
Some(bar_result.clone()),
UNode("baz".to_string()),
true,
OperationContext::default(),
);
assert!(baz_result.is_ok());
let baz_result = baz_result.unwrap();
@ -306,7 +327,12 @@ mod tests {
let roots = list_roots(&connection);
assert_eq!(roots.unwrap(), [foo_result, bar_result.clone()]);
let resolve_result = resolve_path(&connection, &"bar/baz".parse().unwrap(), false);
let resolve_result = resolve_path(
&connection,
&"bar/baz".parse().unwrap(),
false,
OperationContext::default(),
);
assert!(resolve_result.is_ok());
assert_eq!(
@ -314,10 +340,20 @@ mod tests {
vec![bar_result.clone(), baz_result.clone()]
);
let resolve_result = resolve_path(&connection, &"bar/baz/bax".parse().unwrap(), false);
let resolve_result = resolve_path(
&connection,
&"bar/baz/bax".parse().unwrap(),
false,
OperationContext::default(),
);
assert!(resolve_result.is_err());
let resolve_result = resolve_path(&connection, &"bar/baz/bax".parse().unwrap(), true);
let resolve_result = resolve_path(
&connection,
&"bar/baz/bax".parse().unwrap(),
true,
OperationContext::default(),
);
assert!(resolve_result.is_ok());
let bax_result = fetch_or_create_dir(
@ -325,6 +361,7 @@ mod tests {
Some(baz_result.clone()),
UNode("bax".to_string()),
false,
OperationContext::default(),
);
assert!(bax_result.is_ok());
let bax_result = bax_result.unwrap();

View File

@ -1,4 +1,4 @@
use super::schema::{data, meta};
use super::schema::{data, meta, users};
use chrono::NaiveDateTime;
use serde::Serialize;
@ -13,6 +13,7 @@ pub struct Entry {
pub value_num: Option<f64>,
pub immutable: bool,
pub provenance: String,
pub user: Option<String>,
pub timestamp: NaiveDateTime,
}
@ -23,3 +24,11 @@ pub struct MetaValue {
pub key: String,
pub value: String,
}
#[derive(Queryable, Insertable, Serialize, Clone, Debug)]
#[table_name = "users"]
pub struct UserValue {
pub id: i32,
pub username: String,
pub password: String,
}

View File

@ -8,6 +8,7 @@ table! {
value_num -> Nullable<Double>,
immutable -> Bool,
provenance -> Text,
user -> Nullable<Text>,
timestamp -> Timestamp,
}
}
@ -20,4 +21,10 @@ table! {
}
}
allow_tables_to_appear_in_same_query!(data, meta,);
table! {
users (id) {
id -> Integer,
username -> Text,
password -> Text,
}
}

View File

@ -26,6 +26,7 @@ use crate::inner::models;
use crate::inner::schema::data;
use crate::util::LoggerSink;
use anyhow::{anyhow, Result};
use argon2::{Argon2, PasswordHash, PasswordHasher, PasswordVerifier};
use diesel::prelude::*;
use diesel::r2d2::{self, ConnectionManager};
use diesel::result::{DatabaseErrorKind, Error};
@ -40,7 +41,7 @@ use std::sync::{Arc, Mutex, RwLock};
use std::time::Duration;
use tracing::{debug, error, trace, warn};
use upend_base::addressing::{Address, Addressable};
use upend_base::entry::{Entry, EntryValue, ImmutableEntry};
use upend_base::entry::{Attribute, Entry, EntryValue, ImmutableEntry};
use upend_base::error::UpEndError;
use upend_base::hash::UpMultihash;
use upend_base::lang::Query;
@ -116,7 +117,7 @@ pub const DATABASE_FILENAME: &str = "upend.sqlite3";
impl UpEndDatabase {
pub fn open<P: AsRef<Path>>(dirpath: P, reinitialize: bool) -> Result<OpenResult> {
embed_migrations!("./migrations/upend/");
embed_migrations!("./migrations/upend");
let upend_path = dirpath.as_ref().join(UPEND_SUBDIR);
@ -273,6 +274,63 @@ impl UpEndConnection {
Ok(VaultOptions { blob_mode })
}
pub fn get_users(&self) -> Result<Vec<String>> {
use crate::inner::schema::users::dsl;
let _lock = self.lock.read().unwrap();
let conn = self.pool.get()?;
let result = dsl::users.select(dsl::username).load::<String>(&conn)?;
Ok(result)
}
pub fn set_user(&self, username: &str, password: &str) -> Result<bool> {
use crate::inner::schema::users::dsl;
let salt = password_hash::SaltString::generate(&mut password_hash::rand_core::OsRng);
let argon2 = Argon2::default();
let hashed_password = argon2
.hash_password(password.as_ref(), &salt)
.map_err(|e| anyhow!(e))?
.to_string();
let _lock = self.lock.write().unwrap();
let conn = self.pool.get()?;
let result = diesel::replace_into(dsl::users)
.values((
dsl::username.eq(username),
dsl::password.eq(hashed_password),
))
.execute(&conn)?;
Ok(result > 0)
}
pub fn authenticate_user(&self, username: &str, password: &str) -> Result<()> {
use crate::inner::schema::users::dsl;
let conn = self.pool.get()?;
let user_result = dsl::users
.filter(dsl::username.eq(username))
.load::<models::UserValue>(&conn)?;
match user_result.first() {
Some(user) => {
let parsed_hash = PasswordHash::new(&user.password).map_err(|e| anyhow!(e))?;
let argon2 = Argon2::default();
argon2
.verify_password(password.as_ref(), &parsed_hash)
.map_err(|e| anyhow!(e))
}
None => {
let argon2 = Argon2::default();
let _ = argon2
.verify_password(password.as_ref(), &PasswordHash::new(&DUMMY_HASH).unwrap());
Err(anyhow!("user not found"))
}
}
}
pub fn retrieve_entry(&self, hash: &UpMultihash) -> Result<Option<Entry>> {
use crate::inner::schema::data::dsl::*;
@ -308,7 +366,7 @@ impl UpEndConnection {
let entries = primary
.iter()
.map(Entry::try_from)
.collect::<Result<Vec<Entry>>>()?;
.collect::<Result<Vec<Entry>, UpEndError>>()?;
let secondary = data
.filter(
@ -326,7 +384,7 @@ impl UpEndConnection {
let secondary_entries = secondary
.iter()
.map(Entry::try_from)
.collect::<Result<Vec<Entry>>>()?;
.collect::<Result<Vec<Entry>, UpEndError>>()?;
Ok([entries, secondary_entries].concat())
}
@ -414,7 +472,7 @@ impl UpEndConnection {
}
// #[deprecated]
pub fn get_all_attributes(&self) -> Result<Vec<String>> {
pub fn get_all_attributes(&self) -> Result<Vec<Attribute>> {
use crate::inner::schema::data::dsl::*;
let _lock = self.lock.read().unwrap();
@ -426,7 +484,10 @@ impl UpEndConnection {
.order_by(attribute)
.load::<String>(&conn)?;
Ok(result)
Ok(result
.into_iter()
.map(|a| a.parse())
.collect::<Result<Vec<Attribute>, UpEndError>>()?)
}
pub fn get_stats(&self) -> Result<serde_json::Value> {
@ -478,13 +539,23 @@ impl UpEndConnection {
)
.load(&conn)?;
result
Ok(result
.iter()
.map(Entry::try_from)
.collect::<Result<Vec<Entry>>>()
.collect::<Result<Vec<Entry>, UpEndError>>()?)
}
}
lazy_static! {
static ref DUMMY_HASH: String = Argon2::default()
.hash_password(
"password".as_ref(),
&password_hash::SaltString::generate(&mut password_hash::rand_core::OsRng)
)
.unwrap()
.to_string();
}
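When the username is unknown, `authenticate_user` still performs a full Argon2 verification against the pre-computed `DUMMY_HASH`, so the unknown-user and wrong-password paths do comparable work and response timing doesn't betray which usernames are registered. A hash-and-verify sketch with the same crates:

```rust
use argon2::{Argon2, PasswordHash, PasswordHasher, PasswordVerifier};

fn demo() -> Result<(), password_hash::Error> {
    let salt = password_hash::SaltString::generate(&mut password_hash::rand_core::OsRng);
    let argon2 = Argon2::default();
    let hashed = argon2.hash_password(b"hunter2", &salt)?.to_string();

    let parsed = PasswordHash::new(&hashed)?;
    argon2.verify_password(b"hunter2", &parsed)?; // Ok(())
    assert!(argon2.verify_password(b"wrong", &parsed).is_err());
    Ok(())
}
```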
#[cfg(test)]
mod test {
use upend_base::constants::{ATTR_IN, ATTR_LABEL};
@ -599,6 +670,22 @@ mod test {
assert_eq!(result[0].entity, edge_entity);
assert_eq!(result[0].value, EntryValue::Address(random_entity));
}
#[test]
fn test_users() {
let tempdir = TempDir::new().unwrap();
let result = UpEndDatabase::open(&tempdir, false).unwrap();
let db = result.db;
let connection = db.connection().unwrap();
assert!(connection.authenticate_user("thm", "hunter2").is_err());
connection.set_user("thm", "hunter2").unwrap();
connection.authenticate_user("thm", "hunter2").unwrap();
assert!(connection.authenticate_user("thm", "password").is_err());
connection.set_user("thm", "password").unwrap();
connection.authenticate_user("thm", "password").unwrap();
}
}
#[derive(Debug, Serialize, Deserialize)]
@ -634,3 +721,18 @@ impl std::str::FromStr for BlobMode {
}
}
}
#[derive(Debug, Clone)]
pub struct OperationContext {
pub user: Option<String>,
pub provenance: String,
}
impl Default for OperationContext {
fn default() -> Self {
Self {
user: None,
provenance: "SYSTEM".to_string(),
}
}
}
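`OperationContext` is the thread of attribution running through this whole changeset: every write path now records who acted and via what. The two constructions used above, sketched:

```rust
fn contexts() -> (OperationContext, OperationContext) {
    // Startup, rescans, and background jobs run as the anonymous system.
    let system = OperationContext::default(); // user: None, provenance: "SYSTEM"
    // HTTP handlers attach the username returned by check_auth.
    let api = OperationContext {
        user: Some("alice".to_string()), // illustrative; really the JWT user claim
        provenance: "API".to_string(),
    };
    (system, api)
}
```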

View File

@ -3,9 +3,10 @@ macro_rules! upend_insert_val {
($db_connection:expr, $entity:expr, $attribute:expr, $value:expr) => {{
$db_connection.insert_entry(Entry {
entity: $entity.clone(),
attribute: String::from($attribute),
attribute: $attribute.parse().unwrap(),
value: upend_base::entry::EntryValue::String(String::from($value)),
provenance: "SYSTEM INIT".to_string(),
user: None,
timestamp: chrono::Utc::now().naive_utc(),
})
}};
@ -16,9 +17,10 @@ macro_rules! upend_insert_addr {
($db_connection:expr, $entity:expr, $attribute:expr, $addr:expr) => {{
$db_connection.insert_entry(Entry {
entity: $entity.clone(),
attribute: String::from($attribute),
attribute: $attribute.parse().unwrap(),
value: upend_base::entry::EntryValue::Address($addr.clone()),
provenance: "SYSTEM INIT".to_string(),
user: None,
timestamp: chrono::Utc::now().naive_utc(),
})
}};

View File

@ -5,7 +5,8 @@ use crate::hierarchies::{resolve_path, resolve_path_cached, ResolveCache, UHierP
use crate::jobs::{JobContainer, JobHandle};
use crate::util::hash_at_path;
use crate::{
BlobMode, ConnectionOptions, LoggingHandler, UpEndConnection, UpEndDatabase, UPEND_SUBDIR,
BlobMode, ConnectionOptions, LoggingHandler, OperationContext, UpEndConnection, UpEndDatabase,
UPEND_SUBDIR,
};
use anyhow::{anyhow, Result};
use chrono::prelude::*;
@ -95,6 +96,7 @@ impl FsStore {
db: D,
job_handle: JobHandle,
options: UpdateOptions,
context: OperationContext,
) -> Result<Vec<UpdatePathOutcome>> {
let start = Instant::now();
info!("Vault rescan started.");
@ -108,13 +110,13 @@ impl FsStore {
trace!("Initializing DB types.");
upend_insert_addr!(
upconnection,
Address::Attribute(FILE_SIZE_KEY.to_string()),
Address::Attribute(FILE_SIZE_KEY.parse().unwrap()),
ATTR_OF,
TYPE_HASH_ADDRESS
)?;
upend_insert_addr!(
upconnection,
Address::Attribute(FILE_MIME_KEY.to_string()),
Address::Attribute(FILE_MIME_KEY.parse().unwrap()),
ATTR_OF,
TYPE_HASH_ADDRESS
)?;
@ -136,6 +138,7 @@ impl FsStore {
// Actual processing
let count = RwLock::new(0_usize);
#[allow(clippy::type_complexity)]
let resolve_cache: Arc<Mutex<LruCache<(Option<Address>, UNode), Address>>> =
Arc::new(Mutex::new(LruCache::new(256)));
let total = paths.len() as f32;
@ -152,6 +155,7 @@ impl FsStore {
&existing_files,
&resolve_cache,
quick_check,
context.clone(),
);
let mut cnt = count.write().unwrap();
@ -238,6 +242,7 @@ impl FsStore {
Ok(all_outcomes)
}
#[allow(clippy::too_many_arguments)]
fn process_directory_entry<D: Borrow<UpEndDatabase>>(
&self,
db: D,
@ -247,6 +252,7 @@ impl FsStore {
existing_files: &Arc<RwLock<Vec<db::File>>>,
resolve_cache: &Arc<Mutex<ResolveCache>>,
quick_check: bool,
context: OperationContext,
) -> Result<UpdatePathOutcome> {
trace!("Processing: {:?}", path);
@ -364,6 +370,7 @@ impl FsStore {
size,
mtime,
Some(resolve_cache),
context,
)
.map(|_| {
info!("Added: {:?}", path);
@ -420,6 +427,7 @@ impl FsStore {
size: i64,
mtime: Option<NaiveDateTime>,
resolve_cache: Option<&Arc<Mutex<ResolveCache>>>,
context: OperationContext,
) -> Result<Address> {
let normalized_path = self.normalize_path(path)?;
let new_file = db::NewFile {
@ -438,24 +446,26 @@ impl FsStore {
// Metadata
let size_entry = Entry {
entity: blob_address.clone(),
attribute: FILE_SIZE_KEY.to_string(),
attribute: FILE_SIZE_KEY.parse().unwrap(),
value: (size as f64).into(),
provenance: "SYSTEM INIT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: context.user.clone(),
};
let mime_type = tree_magic_mini::from_filepath(path).map(|s| s.to_string());
let mime_entry = mime_type.map(|mime_type| Entry {
entity: blob_address.clone(),
attribute: FILE_MIME_KEY.to_string(),
attribute: FILE_MIME_KEY.parse().unwrap(),
value: mime_type.into(),
provenance: "SYSTEM INIT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: context.user.clone(),
});
let added_entry = Entry {
entity: blob_address.clone(),
attribute: ATTR_ADDED.to_string(),
attribute: ATTR_ADDED.parse().unwrap(),
value: (SystemTime::now()
.duration_since(UNIX_EPOCH)
.unwrap()
@ -463,6 +473,7 @@ impl FsStore {
.into(),
provenance: "SYSTEM INIT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: context.user.clone(),
};
let components = normalized_path.components().collect::<Vec<Component>>();
@ -480,37 +491,42 @@ impl FsStore {
let label_entry = Entry {
entity: blob_address.clone(),
attribute: ATTR_LABEL.to_string(),
attribute: ATTR_LABEL.parse().unwrap(),
value: name
.unwrap_or_else(|| filename.as_os_str().to_string_lossy().to_string())
.into(),
provenance: "SYSTEM INIT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: context.user.clone(),
};
let label_entry_addr = connection.insert_entry(label_entry)?;
if let Some(upath) = upath {
let resolved_path = match resolve_cache {
Some(cache) => resolve_path_cached(connection, &upath, true, cache)?,
None => resolve_path(connection, &upath, true)?,
Some(cache) => {
resolve_path_cached(connection, &upath, true, context.clone(), cache)?
}
None => resolve_path(connection, &upath, true, context.clone())?,
};
let parent_dir = resolved_path.last().unwrap();
let dir_has_entry = Entry {
entity: blob_address.clone(),
attribute: ATTR_IN.to_string(),
attribute: ATTR_IN.parse().unwrap(),
value: parent_dir.clone().into(),
provenance: "SYSTEM INIT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: context.user.clone(),
};
let dir_has_entry_addr = connection.insert_entry(dir_has_entry)?;
let alias_entry = Entry {
entity: dir_has_entry_addr,
attribute: ATTR_BY.to_string(),
attribute: ATTR_BY.parse().unwrap(),
value: label_entry_addr.into(),
provenance: "SYSTEM INIT".to_string(),
timestamp: chrono::Utc::now().naive_utc(),
user: context.user.clone(),
};
connection.insert_entry(alias_entry)?;
}
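The block above stores directory placement as ordinary entries, including an entry about an entry: the file gets an `ATTR_IN` link to its parent, and a second `ATTR_BY` entry whose *entity* is the address of that membership entry, tying it to the label it was filed under. A condensed sketch (hypothetical helper; field values as in the diff):

```rust
fn link_file(
    connection: &UpEndConnection,
    file_addr: Address,
    parent_dir: Address,
    label_entry_addr: Address,
    context: &OperationContext,
) -> anyhow::Result<()> {
    let membership = Entry {
        entity: file_addr,
        attribute: ATTR_IN.parse().unwrap(),
        value: parent_dir.into(),
        provenance: "SYSTEM INIT".to_string(),
        user: context.user.clone(),
        timestamp: chrono::Utc::now().naive_utc(),
    };
    let membership_addr = connection.insert_entry(membership)?;
    connection.insert_entry(Entry {
        entity: membership_addr, // an entry about the membership entry itself
        attribute: ATTR_BY.parse().unwrap(),
        value: label_entry_addr.into(),
        provenance: "SYSTEM INIT".to_string(),
        user: context.user.clone(),
        timestamp: chrono::Utc::now().naive_utc(),
    })?;
    Ok(())
}
```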
@ -649,6 +665,7 @@ impl UpStore for FsStore {
blob: Blob,
name_hint: Option<String>,
blob_mode: Option<BlobMode>,
context: OperationContext,
) -> Result<UpMultihash, super::StoreError> {
let file_path = blob.get_file_path();
let hash = hash_at_path(file_path).map_err(|e| StoreError::Unknown(e.to_string()))?;
@ -702,6 +719,7 @@ impl UpStore for FsStore {
size,
mtime,
None,
context,
)
.map_err(|e| StoreError::Unknown(e.to_string()))?;
}
@ -714,6 +732,7 @@ impl UpStore for FsStore {
db: &UpEndDatabase,
mut job_container: JobContainer,
options: UpdateOptions,
context: OperationContext,
) -> Result<Vec<UpdatePathOutcome>, StoreError> {
trace!(
"Running a vault update of {:?}, options = {:?}.",
@ -724,7 +743,7 @@ impl UpStore for FsStore {
match job_result {
Ok(job_handle) => {
let result = self.rescan_vault(db, job_handle, options);
let result = self.rescan_vault(db, job_handle, options, context);
if let Err(err) = &result {
error!("Update did not succeed! {:?}", err);
@ -834,6 +853,7 @@ mod test {
initial: true,
tree_mode: BlobMode::default(),
},
OperationContext::default(),
);
assert!(rescan_result.is_ok());
}
@ -880,6 +900,7 @@ mod test {
initial: quick,
tree_mode: BlobMode::default(),
},
OperationContext::default(),
);
assert!(rescan_result.is_ok());
@ -900,6 +921,7 @@ mod test {
initial: quick,
tree_mode: BlobMode::default(),
},
OperationContext::default(),
);
assert!(rescan_result.is_ok());
@ -923,6 +945,7 @@ mod test {
initial: quick,
tree_mode: BlobMode::default(),
},
OperationContext::default(),
);
assert!(rescan_result.is_ok());
@ -975,6 +998,7 @@ mod test {
initial: quick,
tree_mode: BlobMode::default(),
},
OperationContext::default(),
);
assert!(rescan_result.is_ok());
@ -1077,6 +1101,7 @@ mod test {
initial: true,
tree_mode,
},
OperationContext::default(),
)
.unwrap();
@ -1087,7 +1112,7 @@ mod test {
paths.iter().for_each(|path| {
let upath: UHierPath = path.parse().unwrap();
assert!(
resolve_path(&connection, &upath, false).is_ok(),
resolve_path(&connection, &upath, false, OperationContext::default()).is_ok(),
"Failed: {}",
upath
);

View File

@ -1,6 +1,7 @@
use std::path::{Path, PathBuf};
use super::{UpEndConnection, UpEndDatabase};
use crate::OperationContext;
use crate::{jobs::JobContainer, BlobMode};
use upend_base::hash::UpMultihash;
@ -61,12 +62,14 @@ pub trait UpStore {
blob: Blob,
name_hint: Option<String>,
blob_mode: Option<BlobMode>,
context: OperationContext,
) -> Result<UpMultihash>;
fn update(
&self,
database: &UpEndDatabase,
job_container: JobContainer,
options: UpdateOptions,
context: OperationContext,
) -> Result<Vec<UpdatePathOutcome>>;
fn stats(&self) -> Result<serde_json::Value>;
}

sdks/js/.eslintignore (new file)
View File

@ -0,0 +1,3 @@
node_modules
/dist
tests

View File

@ -1,3 +1,2 @@
node_modules
*.wasm
*.js
dist

5 sdks/js/jest.config.js Normal file
View File

@ -0,0 +1,5 @@
/** @type {import('ts-jest').JestConfigWithTsJest} */
module.exports = {
preset: 'ts-jest',
testEnvironment: 'node',
};
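This config replaces the old ava setup: with the ts-jest preset, TypeScript test files are compiled on the fly and run in a Node environment. A minimal sketch of a test this config would pick up (the file name and assertion are illustrative, not part of the changeset):

// sdks/js/tests/sanity.test.ts — hypothetical example
describe("jest setup", () => {
  test("compiles and runs TypeScript tests via ts-jest", () => {
    expect(1 + 1).toBe(2);
  });
});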

View File

@ -1,26 +1,34 @@
{
"name": "@upnd/upend",
"version": "0.3.3",
"version": "0.5.5",
"description": "Client library to interact with the UpEnd system.",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"files": [
"dist/**/*"
],
"exports": {
".": "./dist/index.js",
"./*": "./dist/*.js",
"./wasm": "./dist/wasm/index.js",
"./wasm/*": "./dist/wasm/*.js"
},
"scripts": {
"build": "tsc --build --verbose",
"test": "pnpm build && ava",
"test": "jest",
"lint": "eslint ."
},
"author": "Tomáš Mládek <t@mldk.cz>",
"license": "AGPL-3.0",
"files": [
"**/*.ts",
"**/*.js",
"**/*.wasm"
],
"devDependencies": {
"@types/debug": "^4.1.8",
"@types/jest": "^29.5.12",
"@typescript-eslint/eslint-plugin": "latest",
"@typescript-eslint/parser": "latest",
"ava": "^3.15.0",
"eslint": "^8.7.0",
"eslint-plugin-ava": "^14.0.0",
"jest": "^29.7.0",
"ts-jest": "^29.1.2",
"typescript": "^4.4.4"
},
"dependencies": {

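Besides the version bump and the ava-to-jest switch, the new `exports` map controls how consumers resolve the built files under `dist/`. Roughly, and assuming the published layout mirrors the declared `files` globs (a sketch, not taken from the changeset):

// Root export ("." resolves to ./dist/index.js):
import { UpEndApi } from "@upnd/upend";
// Subpath export ("./*" resolves to ./dist/*.js):
import { Query } from "@upnd/upend/query";
// WASM subpath ("./wasm/*" resolves to ./dist/wasm/*.js):
import { UpEndWasmExtensionsWeb } from "@upnd/upend/wasm/web";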
File diff suppressed because it is too large

View File

@ -1,9 +1,9 @@
import LRU from "lru-cache";
import type { Query, UpObject } from ".";
import { UpListing } from ".";
import type { Query, UpObject } from "./index";
import { UpListing } from "./index";
import type {
- ADDRESS_TYPE,
Address,
+ ADDRESS_TYPE,
AttributeListingResult,
EntityListing,
IJob,
@ -14,12 +14,20 @@ import type {
StoreInfo,
VaultInfo,
} from "./types";
import type { UpEndWasmExtensions, AddressComponents } from "./wasm";
import type { AddressComponents, UpEndWasmExtensions } from "./wasm";
import debug from "debug";
import { browser } from "./util";
const dbg = debug("upend:api");
export type { AddressComponents };
export type UpendApiError = {
kind: "Unauthorized" | "HttpError" | "FetchError" | "Unknown";
message?: string;
error?: Error;
};
export class UpEndApi {
private instanceUrl = "";
private readonly wasmExtensions: UpEndWasmExtensions | undefined = undefined;
@ -27,15 +35,21 @@ export class UpEndApi {
private queryOnceLRU = new LRU<string, UpListing>({ max: 128 });
private inFlightRequests: { [key: string]: Promise<UpListing> | null } = {};
private key: string | undefined;
private readonly onError: ((error: UpendApiError) => void) | undefined;
- constructor(config: {
+ constructor(config?: {
instanceUrl?: string;
wasmExtensions?: UpEndWasmExtensions;
timeout?: number;
authKey?: string;
onError?: (error: UpendApiError) => void;
}) {
this.setInstanceUrl(config.instanceUrl || "http://localhost:8093");
this.wasmExtensions = config.wasmExtensions;
this.timeout = config.timeout || 30_000;
this.setInstanceUrl(config?.instanceUrl || "http://localhost:8093");
this.wasmExtensions = config?.wasmExtensions;
this.timeout = config?.timeout || 30_000;
this.key = config?.authKey;
this.onError = config?.onError;
}
public setInstanceUrl(apiUrl: string) {
@ -51,10 +65,10 @@ export class UpEndApi {
options?: ApiFetchOptions,
): Promise<UpObject> {
dbg("Fetching Entity %s", address);
- const signal = this.getAbortSignal(options);
- const entityFetch = await fetch(`${this.apiUrl}/obj/${address}`, {
- signal,
- });
+ const entityFetch = await this.fetch(
+ `${this.apiUrl}/obj/${address}`,
+ options,
+ );
const entityResult = (await entityFetch.json()) as EntityListing;
const entityListing = new UpListing(entityResult.entries);
return entityListing.getObject(address);
@ -62,8 +76,7 @@ export class UpEndApi {
public async fetchEntry(address: string, options?: ApiFetchOptions) {
dbg("Fetching entry %s", address);
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/raw/${address}`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/raw/${address}`, options);
const data = await response.json();
const listing = new UpListing({ address: data });
return listing.entries[0];
@ -80,12 +93,10 @@ export class UpEndApi {
if (!this.inFlightRequests[queryStr]) {
dbg(`Querying: ${query}`);
this.inFlightRequests[queryStr] = new Promise((resolve, reject) => {
- const signal = this.getAbortSignal(options);
- fetch(`${this.apiUrl}/query`, {
+ this.fetch(`${this.apiUrl}/query`, options, {
method: "POST",
body: queryStr,
keepalive: true,
- signal,
})
.then(async (response) => {
if (!response.ok) {
@ -115,12 +126,10 @@ export class UpEndApi {
options?: ApiFetchOptions,
): Promise<PutResult> {
dbg("Putting %O", input);
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/obj`, {
- method: "PUT",
+ const response = await this.fetch(`${this.apiUrl}/obj`, options, {
headers: { "Content-Type": "application/json" },
+ method: "PUT",
body: JSON.stringify(input),
- signal,
});
return await response.json();
@ -139,12 +148,10 @@ export class UpEndApi {
url += `?provenance=${provenance}`;
}
- const signal = this.getAbortSignal(options);
- const response = await fetch(url, {
+ const response = await this.fetch(url, options, {
method: "PUT",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(value),
- signal,
});
return await response.json();
@ -152,28 +159,66 @@ export class UpEndApi {
public async putBlob(
fileOrUrl: File | URL,
- options?: ApiFetchOptions,
- ): Promise<PutResult> {
+ options?: ApiFetchOptions & { onProgress?: (ev: ProgressEvent) => void },
+ ): Promise<Address> {
dbg("Putting Blob: %O", fileOrUrl);
const formData = new FormData();
if (fileOrUrl instanceof File) {
formData.append(fileOrUrl.name, fileOrUrl);
} else {
formData.append("@url", fileOrUrl.toString());
}
const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/blob`, {
- method: "PUT",
- body: formData,
- signal,
- });
- if (!response.ok) {
- throw Error(await response.text());
if (browser && fileOrUrl instanceof File) {
dbg("Using XHR for file upload");
const xhrdbg = debug("upend:api:xhr");
const xhr = new XMLHttpRequest();
signal.addEventListener("abort", () => xhr.abort());
for (const event of [
"loadstart",
"load",
"loadend",
"progress",
"abort",
"error",
] as const) {
xhr.addEventListener(event, (ev) => xhrdbg(`XHR ${event}: %O`, ev));
xhr.upload.addEventListener(event, (ev) =>
xhrdbg(`XHR upload ${event}: %O`, ev),
);
if (options?.onProgress) {
xhr.upload.addEventListener(event, options.onProgress);
}
}
return new Promise((resolve, reject) => {
xhr.open("PUT", `${this.apiUrl}/blob`, true);
xhr.onload = () => {
if (xhr.status >= 200 && xhr.status < 300) {
try {
resolve(JSON.parse(xhr.responseText));
} catch (e) {
reject(e);
}
} else {
reject(xhr.statusText);
}
};
xhr.send(formData);
});
} else {
const response = await this.fetch(`${this.apiUrl}/blob`, options, {
method: "PUT",
body: formData,
});
if (!response.ok) {
throw Error(await response.text());
}
return await response.json();
}
- return await response.json();
}
public async deleteEntry(
@ -181,12 +226,20 @@ export class UpEndApi {
options?: ApiFetchOptions,
): Promise<void> {
dbg("Deleting entry %s", address);
- const signal = this.getAbortSignal(options);
- await fetch(`${this.apiUrl}/obj/${address}`, { method: "DELETE", signal });
+ await this.fetch(`${this.apiUrl}/obj/${address}`, options, {
+ method: "DELETE",
+ });
}
- public getRaw(address: Address, preview = false) {
- return `${this.apiUrl}/${preview ? "thumb" : "raw"}/${address}`;
+ public getRaw(
+ address: Address,
+ config?: { preview?: boolean; authenticated?: boolean },
+ ) {
+ let result = `${this.apiUrl}/${config?.preview ? "thumb" : "raw"}/${address}`;
+ if (config?.authenticated) {
+ result += `?auth_key=${this.key}`;
+ }
+ return result
}
public async fetchRaw(
@ -195,26 +248,24 @@ export class UpEndApi {
options?: ApiFetchOptions,
) {
dbg("Getting %s raw (preview = %s)", address, preview);
- const signal = this.getAbortSignal(options);
- return await fetch(this.getRaw(address, preview), { signal });
+ return await this.fetch(this.getRaw(address, { preview }), options);
}
public async refreshVault(options?: ApiFetchOptions) {
dbg("Triggering vault refresh");
- const signal = this.getAbortSignal(options);
- return await fetch(`${this.apiUrl}/refresh`, { method: "POST", signal });
+ return await this.fetch(`${this.apiUrl}/refresh`, options, {
+ method: "POST",
+ });
}
public async nativeOpen(address: Address, options?: ApiFetchOptions) {
dbg("Opening %s natively", address);
- const signal = this.getAbortSignal(options);
- return fetch(`${this.apiUrl}/raw/${address}?native=1`, { signal });
+ return this.fetch(`${this.apiUrl}/raw/${address}?native=1`, options);
}
public async fetchRoots(options?: ApiFetchOptions): Promise<ListingResult> {
dbg("Fetching hierarchical roots...");
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/hier_roots`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/hier_roots`, options);
const roots = await response.json();
dbg("Hierarchical roots: %O", roots);
return roots;
@ -222,8 +273,7 @@ export class UpEndApi {
public async fetchJobs(options?: ApiFetchOptions): Promise<IJob[]> {
// dbg("Fetching jobs...");
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/jobs`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/jobs`, options);
return await response.json();
}
@ -231,8 +281,7 @@ export class UpEndApi {
options?: ApiFetchOptions,
): Promise<AttributeListingResult> {
dbg("Fetching all attributes...");
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/all/attributes`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/all/attributes`, options);
const result = await response.json();
dbg("All attributes: %O", result);
return await result;
@ -240,19 +289,25 @@ export class UpEndApi {
public async fetchInfo(options?: ApiFetchOptions): Promise<VaultInfo> {
dbg("Fetching vault info...");
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/info`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/info`, options);
const result = await response.json();
dbg("Vault info: %O", result);
return result;
}
public async fetchOptions(options?: ApiFetchOptions): Promise<VaultOptions> {
dbg("Fetching vault options...");
const response = await this.fetch(`${this.apiUrl}/options`, options);
const result = await response.json();
dbg("Vault options: %O", result);
return result;
}
public async fetchStoreInfo(
options?: ApiFetchOptions,
): Promise<{ [key: string]: StoreInfo }> {
dbg("Fetching store info...");
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/stats/store`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/stats/store`, options);
const result = await response.json();
dbg("Store info: %O");
return await result;
@ -268,16 +323,15 @@ export class UpEndApi {
await this.wasmExtensions.init();
return this.wasmExtensions.AddressTypeConstants[input];
}
- const signal = this.getAbortSignal(options);
- response = await fetch(`${this.apiUrl}/address?type=${input}`, {
- signal,
- });
+ response = await this.fetch(
+ `${this.apiUrl}/address?type=${input}`,
+ options,
+ );
} else {
if ("urlContent" in input) {
- const signal = this.getAbortSignal(options);
- response = await fetch(
+ response = await this.fetch(
`${this.apiUrl}/address?url_content=${input.urlContent}`,
- { signal },
+ options,
);
} else {
throw new Error("Input cannot be empty.");
@ -311,8 +365,7 @@ export class UpEndApi {
public async getVaultOptions(
options?: ApiFetchOptions,
): Promise<VaultOptions> {
- const signal = this.getAbortSignal(options);
- const response = await fetch(`${this.apiUrl}/options`, { signal });
+ const response = await this.fetch(`${this.apiUrl}/options`, options);
return await response.json();
}
@ -328,12 +381,10 @@ export class UpEndApi {
payload["blob_mode"] = blob_mode;
}
- const signal = this.getAbortSignal(apiOptions);
- const response = await fetch(`${this.apiUrl}/options`, {
+ const response = await this.fetch(`${this.apiUrl}/options`, apiOptions, {
method: "PUT",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
- signal,
});
if (!response.ok) {
@ -341,14 +392,145 @@ export class UpEndApi {
}
}
public async authenticate(
credentials: {
username: string;
password: string;
},
mode: "key",
options?: ApiFetchOptions,
): Promise<{ key: string }>;
public async authenticate(
credentials: {
username: string;
password: string;
},
mode?: "cookie",
options?: ApiFetchOptions,
): Promise<void>;
public async authenticate(
credentials: {
username: string;
password: string;
},
mode: "key" | "cookie" | undefined,
options?: ApiFetchOptions,
): Promise<{ key: string } | void> {
const via = mode || "cookie";
const response = await this.fetch(
`${this.apiUrl}/auth/login?via=${via}`,
options,
{
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(credentials),
},
);
if (!response.ok) {
throw Error(await response.text());
}
if (mode === "key") {
const data = await response.json();
if (!data.key) {
throw Error("No key returned from server.");
}
this.key = data.key;
return data;
}
}
public async register(credentials: {
username: string;
password: string;
}): Promise<void> {
await this.fetch(`${this.apiUrl}/auth/register`, undefined, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(credentials),
});
}
public async authStatus(
options?: ApiFetchOptions,
): Promise<{ user: string } | undefined> {
const response = await this.fetch(`${this.apiUrl}/auth/whoami`, options);
return await response.json();
}
public async resetAuth(mode: "key"): Promise<void>;
public async resetAuth(
mode?: "cookie",
options?: ApiFetchOptions,
): Promise<void>;
public async resetAuth(
mode?: "key" | "cookie",
options?: ApiFetchOptions,
): Promise<void> {
if (mode === "key") {
this.key = undefined;
} else {
await this.fetch(`${this.apiUrl}/auth/logout`, options, {
method: "POST",
});
}
}
private getAbortSignal(options: ApiFetchOptions | undefined) {
const controller = options?.abortController || new AbortController();
const timeout = options?.timeout || this.timeout;
if (timeout > 0) {
- setTimeout(() => controller.abort(), timeout);
+ setTimeout(() => {
+ dbg("Aborting request after %d ms", timeout);
+ controller.abort();
+ }, timeout);
}
return controller.signal;
}
private async fetch(
url: string,
options: ApiFetchOptions | undefined,
requestInit?: RequestInit & { headers?: Record<string, string> },
): Promise<Response> {
const signal = this.getAbortSignal(options);
const headers = requestInit?.headers || {};
if (this.key) {
headers["Authorization"] = `Bearer ${this.key}`;
}
let result: Response;
let error: UpendApiError | undefined;
try {
result = await fetch(url, {
...requestInit,
signal,
headers,
});
if (!result.ok) {
if (result.status === 401) {
error = { kind: "Unauthorized", message: await result.text() };
} else {
error = {
kind: "HttpError",
message: `HTTP Error ${result.status}: ${result.statusText}`,
};
}
}
} catch (e) {
error = { kind: "FetchError", error: e as Error };
}
if (error) {
if (this.onError) {
this.onError(error);
}
throw error;
}
return result!;
}
}
export interface ApiFetchOptions {
@ -357,6 +539,7 @@ export interface ApiFetchOptions {
}
export type VaultBlobMode = "Flat" | "Mirror" | "Incoming";
export interface VaultOptions {
blob_mode: VaultBlobMode;
}
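Taken together, the api.ts changes funnel every request through the new private `fetch` wrapper, so the stored key, the timeout, and the `onError` callback apply to all calls uniformly. A hedged sketch of how a consumer might use the new surface; the instance URL, credentials, address, and file are placeholders, not taken from the changeset:

import { UpEndApi } from "@upnd/upend";

const api = new UpEndApi({
  instanceUrl: "http://localhost:8093",
  timeout: 10_000,
  // Called for every failed request, in addition to the thrown UpendApiError:
  onError: (error) => console.warn(error.kind, error.message),
});

// Key-based login; the returned key is stored on the client and sent as an
// `Authorization: Bearer ...` header by the private fetch wrapper:
await api.authenticate({ username: "alice", password: "hunter2" }, "key");

// Authenticated raw URL, e.g. for <img> or <audio> elements that cannot set
// headers; the stored key is appended as `?auth_key=...`:
const thumbUrl = api.getRaw("EXAMPLE_ADDRESS", { preview: true, authenticated: true });

// In the browser, uploads go through XMLHttpRequest so progress is observable:
const file = new File(["hello"], "hello.txt");
const address = await api.putBlob(file, {
  onProgress: (ev) => console.log(`uploaded ${ev.loaded} of ${ev.total} bytes`),
});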

View File

@ -121,6 +121,7 @@ export class UpEntry extends UpObject implements IEntry {
attribute: string;
value: IValue;
provenance: string;
user: string;
timestamp: string;
constructor(address: string, entry: IEntry, listing: UpListing) {
@ -130,6 +131,7 @@ export class UpEntry extends UpObject implements IEntry {
this.attribute = entry.attribute;
this.value = entry.value;
this.provenance = entry.provenance;
this.user = entry.user;
this.timestamp = entry.timestamp;
}

View File

@ -18,6 +18,8 @@ export interface IEntry {
value: IValue;
/** The origin or provenance of the data entry (e.g. SYSTEM or USER API...) */
provenance: string;
/** The user who created the data entry. */
user: string;
/** The timestamp when the data entry was created in RFC 3339 format. */
timestamp: string;
}
@ -97,6 +99,7 @@ export interface VaultInfo {
location: string;
version: string;
desktop: boolean;
public: boolean;
}
export interface StoreInfo {

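Entries now carry the creating user end to end, from the Rust store through the wire format to the client classes. A sketch of an entry literal under the updated interface; the `entity` field and the `{ t, c }` value shape are assumptions reconstructed from surrounding context, not shown in these hunks:

import type { IEntry } from "@upnd/upend/types";

const entry: IEntry = {
  entity: "ENTITY_ADDRESS",           // assumed field, placeholder value
  attribute: "LBL",
  value: { t: "String", c: "Hello" }, // IValue shape assumed
  provenance: "API",
  user: "alice",                      // the new field
  timestamp: new Date().toISOString(),
};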
1 sdks/js/src/util.ts Normal file
View File

@ -0,0 +1 @@
export const browser = typeof window !== "undefined";
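This new `browser` constant is what lets `putBlob` pick XHR in the browser and plain fetch elsewhere; the same guard works in consumer code. A trivial sketch (the DOM mutation is arbitrary, purely illustrative):

import { browser } from "@upnd/upend/util";

// Skip DOM access when running under Node or during SSR:
if (browser) {
  document.body.classList.add("upend-client");
}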

View File

@ -1,5 +1,5 @@
import type { AddressComponents, AddressTypeConstants } from ".";
import { UpEndWasmExtensions } from ".";
import type { AddressComponents, AddressTypeConstants } from "./index";
import { UpEndWasmExtensions } from "./index";
import {
addr_to_components,
components_to_addr,

View File

@ -1,5 +1,5 @@
import type { AddressComponents, AddressTypeConstants } from ".";
import { UpEndWasmExtensions } from ".";
import type { AddressComponents, AddressTypeConstants } from "./index";
import { UpEndWasmExtensions } from "./index";
import type { InitInput } from "@upnd/wasm-web";
import {
addr_to_components,

View File

@ -0,0 +1,79 @@
import { Any, Query, Variable } from "../src/query";
describe("query matches", () => {
test("query matches simple", () => {
const query = Query.matches("entity", "attribute", "value");
expect(query.toString()).toBe('(matches entity "attribute" "value")');
});
test("query matches anything", () => {
const query = Query.matches(Any, Any, Any);
expect(query.toString()).toBe("(matches ? ? ?)");
});
test("query matches array", () => {
const query = Query.matches("entity", "attribute", ["value1", "value2"]);
expect(query.toString()).toBe(
'(matches entity "attribute" (in "value1" "value2"))',
);
});
test("query matches addresses", () => {
const query = Query.matches("entity", "attribute", [
"@address1",
"@address2",
]);
expect(query.toString()).toBe(
'(matches entity "attribute" (in @address1 @address2))',
);
});
test("query matches numbers", () => {
const query = Query.matches("entity", "attribute", [1, 2]);
expect(query.toString()).toBe('(matches entity "attribute" (in 1 2))');
});
test("query matches variables", () => {
const query = Query.matches("entity", "attribute", Variable("a"));
expect(query.toString()).toBe('(matches entity "attribute" ?a)');
});
});
describe("compound queries", () => {
test("OR queries", () => {
const query = Query.or(
Query.matches("entity", "attribute1", "value2"),
Query.matches("entity", "attribute2", "value2"),
);
expect(query.toString()).toBe(
'(or (matches entity "attribute1" "value2") (matches entity "attribute2" "value2"))',
);
});
test("AND queries", () => {
const query = Query.and(
Query.matches("entity", "attribute1", "value2"),
Query.matches("entity", "attribute2", "value2"),
);
expect(query.toString()).toBe(
'(and (matches entity "attribute1" "value2") (matches entity "attribute2" "value2"))',
);
});
test("NOT query", () => {
const query = Query.not(Query.matches("entity", "attribute1", "value2"));
expect(query.toString()).toBe(
'(not (matches entity "attribute1" "value2"))',
);
});
test("JOIN queries", () => {
const query = Query.join(
Query.matches("entity", "attribute1", "value2"),
Query.matches("entity", "attribute2", "value2"),
);
expect(query.toString()).toBe(
'(join (matches entity "attribute1" "value2") (matches entity "attribute2" "value2"))',
);
});
});
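The ported suite above doubles as documentation for the query DSL. A hedged sketch of combining the builder with the client, assuming `UpEndApi.query` accepts the builder object directly; the attribute names are illustrative:

import { UpEndApi } from "@upnd/upend";
import { Any, Query, Variable } from "@upnd/upend/query";

// Find entities that are typed, or that carry any label, in one query:
async function findLabelled(api: UpEndApi) {
  const q = Query.or(
    Query.matches(Any, "TYPE", ["BLOB", "GROUP"]),
    Query.matches(Any, "LBL", Variable("label")),
  );
  // The builder serializes itself to the s-expression form shown in the tests:
  return await api.query(q);
}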

19 sdks/js/tsconfig.json Normal file
View File

@ -0,0 +1,19 @@
{
"compilerOptions": {
"declaration": true,
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"importsNotUsedAsValues": "error",
"lib": [
"es2019",
"DOM"
],
"module": "commonjs",
"outDir": "dist",
"rootDir": "src",
"strict": true,
"target": "es5"
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}

1 tools/.gitignore vendored
View File

@ -1 +0,0 @@
*/dist

View File

@ -1,134 +0,0 @@
import hashlib
import logging
from dataclasses import dataclass
from operator import add
import click
import colorama
import psycopg2
from tqdm import tqdm
from upend import UpEnd
class LogFormatter(logging.Formatter):
format_str = "[%(asctime)s] %(levelname)s - %(message)s"
FORMATS = {
logging.DEBUG: colorama.Fore.LIGHTBLACK_EX + format_str + colorama.Fore.RESET,
logging.INFO: format_str,
logging.WARNING: colorama.Fore.YELLOW + format_str + colorama.Fore.RESET,
logging.ERROR: colorama.Fore.RED + format_str + colorama.Fore.RESET,
logging.CRITICAL: colorama.Fore.RED
+ colorama.Style.BRIGHT
+ format_str
+ colorama.Style.RESET_ALL
+ colorama.Fore.RESET,
}
def format(self, record):
log_fmt = self.FORMATS.get(record.levelno)
formatter = logging.Formatter(log_fmt)
return formatter.format(record)
@dataclass
class KSXTrackFile:
file: str
sha256sum: str
energy: int
seriousness: int
tint: int
materials: int
@click.command()
@click.option("--db-name", required=True)
@click.option("--db-user", required=True)
@click.option("--db-password", required=True)
@click.option("--db-host", default="localhost")
@click.option("--db-port", default=5432, type=int)
def main(db_name, db_user, db_password, db_host, db_port):
"""Load KSX database dump into UpEnd."""
logger = logging.getLogger("ksx2upend")
logger.setLevel(logging.DEBUG)
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
ch.setFormatter(LogFormatter())
logger.addHandler(ch)
logger.debug("Connecting to PostgreSQL...")
connection = psycopg2.connect(
database=db_name,
user=db_user,
password=db_password,
host=db_host,
port=db_port,
)
cur = connection.cursor()
logger.debug("Connecting to UpEnd...")
upend = UpEnd()
cur.execute(
"SELECT file, sha256sum, energy, seriousness, tint, materials "
"FROM ksx_radio_trackfile "
"INNER JOIN ksx_radio_moodsregular ON ksx_radio_trackfile.track_id = ksx_radio_moodsregular.track_id"
)
trackfiles = [KSXTrackFile(*row) for row in cur.fetchall()]
logger.info(f"Got {len(trackfiles)} (annotated) trackfiles from database...")
# TODO: get_invariant() or somesuch?
blob_addr = list(upend.query((None, "TYPE", 'J"BLOB"')).values())[0]["entity"]
all_files = upend.query((None, "IS", f"O{blob_addr}")).values()
hashed_files = upend.query((None, "SHA256", None)).values()
logger.info(
f"Got {len(all_files)} files from UpEnd ({len(hashed_files)} of which are hashed)..."
)
if len(hashed_files) < len(all_files):
logger.info("Computing SHA256 hashes for UpEnd files...")
hashed_entries = [entry["entity"] for entry in hashed_files]
unhashed_files = [
file for file in all_files if file["entity"] not in hashed_entries
]
for entry in tqdm(unhashed_files):
sha256_hash = hashlib.sha256()
for chunk in upend.get_raw(entry["entity"]):
sha256_hash.update(chunk)
upend.insert((entry["entity"], "SHA256", sha256_hash.hexdigest()))
hashed_files = upend.query((None, "SHA256", None)).values()
sha256_trackfiles = {tf.sha256sum: tf for tf in trackfiles}
sha256_entities = {entry["value"]["c"]: entry["entity"] for entry in hashed_files}
tf_and_ue = [sum for sum in sha256_trackfiles.keys() if sum in sha256_entities]
logger.info(
f"Out of {len(trackfiles)} trackfiles, and out of {len(hashed_files)} files in UpEnd, {len(tf_and_ue)} are present in both."
)
logger.info("Inserting types...")
ksx_type_result = upend.insert((None, "TYPE", "KSX_TRACK_MOODS"))
ksx_type_addr = list(ksx_type_result.values())[0]["entity"]
upend.insert((ksx_type_addr, "TYPE_REQUIRES", "KSX_ENERGY"))
upend.insert((ksx_type_addr, "TYPE_REQUIRES", "KSX_SERIOUSNESS"))
upend.insert((ksx_type_addr, "TYPE_REQUIRES", "KSX_TINT"))
upend.insert((ksx_type_addr, "TYPE_REQUIRES", "KSX_MATERIALS"))
logger.info("Inserting mood data...")
for sum in tqdm(tf_and_ue):
tf = sha256_trackfiles[sum]
address = sha256_entities[sum]
upend.insert((address, "IS", ksx_type_addr), value_type="Address")
upend.insert((address, "KSX_ENERGY", tf.energy))
upend.insert((address, "KSX_SERIOUSNESS", tf.seriousness))
upend.insert((address, "KSX_TINT", tf.tint))
upend.insert((address, "KSX_MATERIALS", tf.materials))
if __name__ == "__main__":
main()

View File

@ -1,398 +0,0 @@
[[package]]
name = "appdirs"
version = "1.4.4"
description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
category = "dev"
optional = false
python-versions = "*"
[[package]]
name = "black"
version = "21.6b0"
description = "The uncompromising code formatter."
category = "dev"
optional = false
python-versions = ">=3.6.2"
[package.dependencies]
appdirs = "*"
click = ">=7.1.2"
mypy-extensions = ">=0.4.3"
pathspec = ">=0.8.1,<1"
regex = ">=2020.1.8"
toml = ">=0.10.1"
typed-ast = {version = ">=1.4.2", markers = "python_version < \"3.8\""}
typing-extensions = {version = ">=3.7.4", markers = "python_version < \"3.8\""}
[package.extras]
colorama = ["colorama (>=0.4.3)"]
d = ["aiohttp (>=3.6.0)", "aiohttp-cors (>=0.4.0)"]
python2 = ["typed-ast (>=1.4.2)"]
uvloop = ["uvloop (>=0.15.2)"]
[[package]]
name = "certifi"
version = "2021.5.30"
description = "Python package for providing Mozilla's CA Bundle."
category = "main"
optional = false
python-versions = "*"
[[package]]
name = "chardet"
version = "4.0.0"
description = "Universal encoding detector for Python 2 and 3"
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
[[package]]
name = "click"
version = "8.0.1"
description = "Composable command line interface toolkit"
category = "main"
optional = false
python-versions = ">=3.6"
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
importlib-metadata = {version = "*", markers = "python_version < \"3.8\""}
[[package]]
name = "colorama"
version = "0.4.4"
description = "Cross-platform colored terminal text."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
[[package]]
name = "idna"
version = "2.10"
description = "Internationalized Domain Names in Applications (IDNA)"
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
[[package]]
name = "importlib-metadata"
version = "4.5.0"
description = "Read metadata from Python packages"
category = "main"
optional = false
python-versions = ">=3.6"
[package.dependencies]
typing-extensions = {version = ">=3.6.4", markers = "python_version < \"3.8\""}
zipp = ">=0.5"
[package.extras]
docs = ["sphinx", "jaraco.packaging (>=8.2)", "rst.linker (>=1.9)"]
testing = ["pytest (>=4.6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.0.1)", "packaging", "pep517", "pyfakefs", "flufl.flake8", "pytest-black (>=0.3.7)", "pytest-mypy", "importlib-resources (>=1.3)"]
[[package]]
name = "mypy-extensions"
version = "0.4.3"
description = "Experimental type system extensions for programs checked with the mypy typechecker."
category = "dev"
optional = false
python-versions = "*"
[[package]]
name = "pathspec"
version = "0.8.1"
description = "Utility library for gitignore style pattern matching of file paths."
category = "dev"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
[[package]]
name = "psycopg2-binary"
version = "2.9.1"
description = "psycopg2 - Python-PostgreSQL Database Adapter"
category = "main"
optional = false
python-versions = ">=3.6"
[[package]]
name = "regex"
version = "2021.4.4"
description = "Alternative regular expression module, to replace re."
category = "dev"
optional = false
python-versions = "*"
[[package]]
name = "requests"
version = "2.25.1"
description = "Python HTTP for Humans."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
[package.dependencies]
certifi = ">=2017.4.17"
chardet = ">=3.0.2,<5"
idna = ">=2.5,<3"
urllib3 = ">=1.21.1,<1.27"
[package.extras]
security = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)"]
socks = ["PySocks (>=1.5.6,!=1.5.7)", "win-inet-pton"]
[[package]]
name = "toml"
version = "0.10.2"
description = "Python Library for Tom's Obvious, Minimal Language"
category = "dev"
optional = false
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
[[package]]
name = "tqdm"
version = "4.61.1"
description = "Fast, Extensible Progress Meter"
category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7"
[package.extras]
dev = ["py-make (>=0.1.0)", "twine", "wheel"]
notebook = ["ipywidgets (>=6)"]
telegram = ["requests"]
[[package]]
name = "typed-ast"
version = "1.4.3"
description = "a fork of Python 2 and 3 ast modules with type comment support"
category = "dev"
optional = false
python-versions = "*"
[[package]]
name = "typing-extensions"
version = "3.10.0.0"
description = "Backported and Experimental Type Hints for Python 3.5+"
category = "main"
optional = false
python-versions = "*"
[[package]]
name = "upend"
version = "0.1.0"
description = ""
category = "main"
optional = false
python-versions = "^3.7"
develop = true
[package.dependencies]
requests = "^2.25.1"
[package.source]
type = "directory"
url = "../upend"
[[package]]
name = "urllib3"
version = "1.26.5"
description = "HTTP library with thread-safe connection pooling, file post, and more."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4"
[package.extras]
brotli = ["brotlipy (>=0.6.0)"]
secure = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "certifi", "ipaddress"]
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]]
name = "zipp"
version = "3.4.1"
description = "Backport of pathlib-compatible object wrapper for zip files"
category = "main"
optional = false
python-versions = ">=3.6"
[package.extras]
docs = ["sphinx", "jaraco.packaging (>=8.2)", "rst.linker (>=1.9)"]
testing = ["pytest (>=4.6)", "pytest-checkdocs (>=1.2.3)", "pytest-flake8", "pytest-cov", "pytest-enabler", "jaraco.itertools", "func-timeout", "pytest-black (>=0.3.7)", "pytest-mypy"]
[metadata]
lock-version = "1.1"
python-versions = "^3.7"
content-hash = "ca9615ad2a499b8ab72751257d9b56f012ef4482b56ce205928d4f326e161787"
[metadata.files]
appdirs = [
{file = "appdirs-1.4.4-py2.py3-none-any.whl", hash = "sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"},
{file = "appdirs-1.4.4.tar.gz", hash = "sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41"},
]
black = [
{file = "black-21.6b0-py3-none-any.whl", hash = "sha256:dfb8c5a069012b2ab1e972e7b908f5fb42b6bbabcba0a788b86dc05067c7d9c7"},
{file = "black-21.6b0.tar.gz", hash = "sha256:dc132348a88d103016726fe360cb9ede02cecf99b76e3660ce6c596be132ce04"},
]
certifi = [
{file = "certifi-2021.5.30-py2.py3-none-any.whl", hash = "sha256:50b1e4f8446b06f41be7dd6338db18e0990601dce795c2b1686458aa7e8fa7d8"},
{file = "certifi-2021.5.30.tar.gz", hash = "sha256:2bbf76fd432960138b3ef6dda3dde0544f27cbf8546c458e60baf371917ba9ee"},
]
chardet = [
{file = "chardet-4.0.0-py2.py3-none-any.whl", hash = "sha256:f864054d66fd9118f2e67044ac8981a54775ec5b67aed0441892edb553d21da5"},
{file = "chardet-4.0.0.tar.gz", hash = "sha256:0d6f53a15db4120f2b08c94f11e7d93d2c911ee118b6b30a04ec3ee8310179fa"},
]
click = [
{file = "click-8.0.1-py3-none-any.whl", hash = "sha256:fba402a4a47334742d782209a7c79bc448911afe1149d07bdabdf480b3e2f4b6"},
{file = "click-8.0.1.tar.gz", hash = "sha256:8c04c11192119b1ef78ea049e0a6f0463e4c48ef00a30160c704337586f3ad7a"},
]
colorama = [
{file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"},
{file = "colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"},
]
idna = [
{file = "idna-2.10-py2.py3-none-any.whl", hash = "sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"},
{file = "idna-2.10.tar.gz", hash = "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6"},
]
importlib-metadata = [
{file = "importlib_metadata-4.5.0-py3-none-any.whl", hash = "sha256:833b26fb89d5de469b24a390e9df088d4e52e4ba33b01dc5e0e4f41b81a16c00"},
{file = "importlib_metadata-4.5.0.tar.gz", hash = "sha256:b142cc1dd1342f31ff04bb7d022492b09920cb64fed867cd3ea6f80fe3ebd139"},
]
mypy-extensions = [
{file = "mypy_extensions-0.4.3-py2.py3-none-any.whl", hash = "sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d"},
{file = "mypy_extensions-0.4.3.tar.gz", hash = "sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"},
]
pathspec = [
{file = "pathspec-0.8.1-py2.py3-none-any.whl", hash = "sha256:aa0cb481c4041bf52ffa7b0d8fa6cd3e88a2ca4879c533c9153882ee2556790d"},
{file = "pathspec-0.8.1.tar.gz", hash = "sha256:86379d6b86d75816baba717e64b1a3a3469deb93bb76d613c9ce79edc5cb68fd"},
]
psycopg2-binary = [
{file = "psycopg2-binary-2.9.1.tar.gz", hash = "sha256:b0221ca5a9837e040ebf61f48899926b5783668b7807419e4adae8175a31f773"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-macosx_10_14_x86_64.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl", hash = "sha256:c250a7ec489b652c892e4f0a5d122cc14c3780f9f643e1a326754aedf82d9a76"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aef9aee84ec78af51107181d02fe8773b100b01c5dfde351184ad9223eab3698"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:123c3fb684e9abfc47218d3784c7b4c47c8587951ea4dd5bc38b6636ac57f616"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-manylinux_2_24_aarch64.whl", hash = "sha256:995fc41ebda5a7a663a254a1dcac52638c3e847f48307b5416ee373da15075d7"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-manylinux_2_24_ppc64le.whl", hash = "sha256:fbb42a541b1093385a2d8c7eec94d26d30437d0e77c1d25dae1dcc46741a385e"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-win32.whl", hash = "sha256:20f1ab44d8c352074e2d7ca67dc00843067788791be373e67a0911998787ce7d"},
{file = "psycopg2_binary-2.9.1-cp36-cp36m-win_amd64.whl", hash = "sha256:f6fac64a38f6768e7bc7b035b9e10d8a538a9fadce06b983fb3e6fa55ac5f5ce"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-macosx_10_14_x86_64.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl", hash = "sha256:1e3a362790edc0a365385b1ac4cc0acc429a0c0d662d829a50b6ce743ae61b5a"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f8559617b1fcf59a9aedba2c9838b5b6aa211ffedecabca412b92a1ff75aac1a"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a36c7eb6152ba5467fb264d73844877be8b0847874d4822b7cf2d3c0cb8cdcb0"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-manylinux_2_24_aarch64.whl", hash = "sha256:2f62c207d1740b0bde5c4e949f857b044818f734a3d57f1d0d0edc65050532ed"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-manylinux_2_24_ppc64le.whl", hash = "sha256:cfc523edecddaef56f6740d7de1ce24a2fdf94fd5e704091856a201872e37f9f"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-win32.whl", hash = "sha256:1e85b74cbbb3056e3656f1cc4781294df03383127a8114cbc6531e8b8367bf1e"},
{file = "psycopg2_binary-2.9.1-cp37-cp37m-win_amd64.whl", hash = "sha256:1473c0215b0613dd938db54a653f68251a45a78b05f6fc21af4326f40e8360a2"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-macosx_10_14_x86_64.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl", hash = "sha256:35c4310f8febe41f442d3c65066ca93cccefd75013df3d8c736c5b93ec288140"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8c13d72ed6af7fd2c8acbd95661cf9477f94e381fce0792c04981a8283b52917"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14db1752acdd2187d99cb2ca0a1a6dfe57fc65c3281e0f20e597aac8d2a5bd90"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-manylinux_2_24_aarch64.whl", hash = "sha256:aed4a9a7e3221b3e252c39d0bf794c438dc5453bc2963e8befe9d4cd324dff72"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-manylinux_2_24_ppc64le.whl", hash = "sha256:da113b70f6ec40e7d81b43d1b139b9db6a05727ab8be1ee559f3a69854a69d34"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-win32.whl", hash = "sha256:4235f9d5ddcab0b8dbd723dca56ea2922b485ea00e1dafacf33b0c7e840b3d32"},
{file = "psycopg2_binary-2.9.1-cp38-cp38-win_amd64.whl", hash = "sha256:988b47ac70d204aed01589ed342303da7c4d84b56c2f4c4b8b00deda123372bf"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-macosx_10_14_x86_64.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl", hash = "sha256:7360647ea04db2e7dff1648d1da825c8cf68dc5fbd80b8fb5b3ee9f068dcd21a"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca86db5b561b894f9e5f115d6a159fff2a2570a652e07889d8a383b5fae66eb4"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ced67f1e34e1a450cdb48eb53ca73b60aa0af21c46b9b35ac3e581cf9f00e31"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-manylinux_2_24_aarch64.whl", hash = "sha256:0f2e04bd2a2ab54fa44ee67fe2d002bb90cee1c0f1cc0ebc3148af7b02034cbd"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-manylinux_2_24_ppc64le.whl", hash = "sha256:3242b9619de955ab44581a03a64bdd7d5e470cc4183e8fcadd85ab9d3756ce7a"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-win32.whl", hash = "sha256:0b7dae87f0b729922e06f85f667de7bf16455d411971b2043bbd9577af9d1975"},
{file = "psycopg2_binary-2.9.1-cp39-cp39-win_amd64.whl", hash = "sha256:b4d7679a08fea64573c969f6994a2631908bb2c0e69a7235648642f3d2e39a68"},
]
regex = [
{file = "regex-2021.4.4-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:619d71c59a78b84d7f18891fe914446d07edd48dc8328c8e149cbe0929b4e000"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:47bf5bf60cf04d72bf6055ae5927a0bd9016096bf3d742fa50d9bf9f45aa0711"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:281d2fd05555079448537fe108d79eb031b403dac622621c78944c235f3fcf11"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:bd28bc2e3a772acbb07787c6308e00d9626ff89e3bfcdebe87fa5afbfdedf968"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:7c2a1af393fcc09e898beba5dd59196edaa3116191cc7257f9224beaed3e1aa0"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:c38c71df845e2aabb7fb0b920d11a1b5ac8526005e533a8920aea97efb8ec6a4"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux2014_i686.whl", hash = "sha256:96fcd1888ab4d03adfc9303a7b3c0bd78c5412b2bfbe76db5b56d9eae004907a"},
{file = "regex-2021.4.4-cp36-cp36m-manylinux2014_x86_64.whl", hash = "sha256:ade17eb5d643b7fead300a1641e9f45401c98eee23763e9ed66a43f92f20b4a7"},
{file = "regex-2021.4.4-cp36-cp36m-win32.whl", hash = "sha256:e8e5b509d5c2ff12f8418006d5a90e9436766133b564db0abaec92fd27fcee29"},
{file = "regex-2021.4.4-cp36-cp36m-win_amd64.whl", hash = "sha256:11d773d75fa650cd36f68d7ca936e3c7afaae41b863b8c387a22aaa78d3c5c79"},
{file = "regex-2021.4.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:d3029c340cfbb3ac0a71798100ccc13b97dddf373a4ae56b6a72cf70dfd53bc8"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:18c071c3eb09c30a264879f0d310d37fe5d3a3111662438889ae2eb6fc570c31"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:4c557a7b470908b1712fe27fb1ef20772b78079808c87d20a90d051660b1d69a"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:01afaf2ec48e196ba91b37451aa353cb7eda77efe518e481707e0515025f0cd5"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:3a9cd17e6e5c7eb328517969e0cb0c3d31fd329298dd0c04af99ebf42e904f82"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:90f11ff637fe8798933fb29f5ae1148c978cccb0452005bf4c69e13db951e765"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux2014_i686.whl", hash = "sha256:919859aa909429fb5aa9cf8807f6045592c85ef56fdd30a9a3747e513db2536e"},
{file = "regex-2021.4.4-cp37-cp37m-manylinux2014_x86_64.whl", hash = "sha256:339456e7d8c06dd36a22e451d58ef72cef293112b559010db3d054d5560ef439"},
{file = "regex-2021.4.4-cp37-cp37m-win32.whl", hash = "sha256:67bdb9702427ceddc6ef3dc382455e90f785af4c13d495f9626861763ee13f9d"},
{file = "regex-2021.4.4-cp37-cp37m-win_amd64.whl", hash = "sha256:32e65442138b7b76dd8173ffa2cf67356b7bc1768851dded39a7a13bf9223da3"},
{file = "regex-2021.4.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1e1c20e29358165242928c2de1482fb2cf4ea54a6a6dea2bd7a0e0d8ee321500"},
{file = "regex-2021.4.4-cp38-cp38-manylinux1_i686.whl", hash = "sha256:314d66636c494ed9c148a42731b3834496cc9a2c4251b1661e40936814542b14"},
{file = "regex-2021.4.4-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:6d1b01031dedf2503631d0903cb563743f397ccaf6607a5e3b19a3d76fc10480"},
{file = "regex-2021.4.4-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:741a9647fcf2e45f3a1cf0e24f5e17febf3efe8d4ba1281dcc3aa0459ef424dc"},
{file = "regex-2021.4.4-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:4c46e22a0933dd783467cf32b3516299fb98cfebd895817d685130cc50cd1093"},
{file = "regex-2021.4.4-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:e512d8ef5ad7b898cdb2d8ee1cb09a8339e4f8be706d27eaa180c2f177248a10"},
{file = "regex-2021.4.4-cp38-cp38-manylinux2014_i686.whl", hash = "sha256:980d7be47c84979d9136328d882f67ec5e50008681d94ecc8afa8a65ed1f4a6f"},
{file = "regex-2021.4.4-cp38-cp38-manylinux2014_x86_64.whl", hash = "sha256:ce15b6d103daff8e9fee13cf7f0add05245a05d866e73926c358e871221eae87"},
{file = "regex-2021.4.4-cp38-cp38-win32.whl", hash = "sha256:a91aa8619b23b79bcbeb37abe286f2f408d2f2d6f29a17237afda55bb54e7aac"},
{file = "regex-2021.4.4-cp38-cp38-win_amd64.whl", hash = "sha256:c0502c0fadef0d23b128605d69b58edb2c681c25d44574fc673b0e52dce71ee2"},
{file = "regex-2021.4.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:598585c9f0af8374c28edd609eb291b5726d7cbce16be6a8b95aa074d252ee17"},
{file = "regex-2021.4.4-cp39-cp39-manylinux1_i686.whl", hash = "sha256:ee54ff27bf0afaf4c3b3a62bcd016c12c3fdb4ec4f413391a90bd38bc3624605"},
{file = "regex-2021.4.4-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:7d9884d86dd4dd489e981d94a65cd30d6f07203d90e98f6f657f05170f6324c9"},
{file = "regex-2021.4.4-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:bf5824bfac591ddb2c1f0a5f4ab72da28994548c708d2191e3b87dd207eb3ad7"},
{file = "regex-2021.4.4-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:563085e55b0d4fb8f746f6a335893bda5c2cef43b2f0258fe1020ab1dd874df8"},
{file = "regex-2021.4.4-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:b9c3db21af35e3b3c05764461b262d6f05bbca08a71a7849fd79d47ba7bc33ed"},
{file = "regex-2021.4.4-cp39-cp39-manylinux2014_i686.whl", hash = "sha256:3916d08be28a1149fb97f7728fca1f7c15d309a9f9682d89d79db75d5e52091c"},
{file = "regex-2021.4.4-cp39-cp39-manylinux2014_x86_64.whl", hash = "sha256:fd45ff9293d9274c5008a2054ecef86a9bfe819a67c7be1afb65e69b405b3042"},
{file = "regex-2021.4.4-cp39-cp39-win32.whl", hash = "sha256:fa4537fb4a98fe8fde99626e4681cc644bdcf2a795038533f9f711513a862ae6"},
{file = "regex-2021.4.4-cp39-cp39-win_amd64.whl", hash = "sha256:97f29f57d5b84e73fbaf99ab3e26134e6687348e95ef6b48cfd2c06807005a07"},
{file = "regex-2021.4.4.tar.gz", hash = "sha256:52ba3d3f9b942c49d7e4bc105bb28551c44065f139a65062ab7912bef10c9afb"},
]
requests = [
{file = "requests-2.25.1-py2.py3-none-any.whl", hash = "sha256:c210084e36a42ae6b9219e00e48287def368a26d03a048ddad7bfee44f75871e"},
{file = "requests-2.25.1.tar.gz", hash = "sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804"},
]
toml = [
{file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
{file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
]
tqdm = [
{file = "tqdm-4.61.1-py2.py3-none-any.whl", hash = "sha256:aa0c29f03f298951ac6318f7c8ce584e48fa22ec26396e6411e43d038243bdb2"},
{file = "tqdm-4.61.1.tar.gz", hash = "sha256:24be966933e942be5f074c29755a95b315c69a91f839a29139bf26ffffe2d3fd"},
]
typed-ast = [
{file = "typed_ast-1.4.3-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:2068531575a125b87a41802130fa7e29f26c09a2833fea68d9a40cf33902eba6"},
{file = "typed_ast-1.4.3-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:c907f561b1e83e93fad565bac5ba9c22d96a54e7ea0267c708bffe863cbe4075"},
{file = "typed_ast-1.4.3-cp35-cp35m-manylinux2014_aarch64.whl", hash = "sha256:1b3ead4a96c9101bef08f9f7d1217c096f31667617b58de957f690c92378b528"},
{file = "typed_ast-1.4.3-cp35-cp35m-win32.whl", hash = "sha256:dde816ca9dac1d9c01dd504ea5967821606f02e510438120091b84e852367428"},
{file = "typed_ast-1.4.3-cp35-cp35m-win_amd64.whl", hash = "sha256:777a26c84bea6cd934422ac2e3b78863a37017618b6e5c08f92ef69853e765d3"},
{file = "typed_ast-1.4.3-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f8afcf15cc511ada719a88e013cec87c11aff7b91f019295eb4530f96fe5ef2f"},
{file = "typed_ast-1.4.3-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:52b1eb8c83f178ab787f3a4283f68258525f8d70f778a2f6dd54d3b5e5fb4341"},
{file = "typed_ast-1.4.3-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:01ae5f73431d21eead5015997ab41afa53aa1fbe252f9da060be5dad2c730ace"},
{file = "typed_ast-1.4.3-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:c190f0899e9f9f8b6b7863debfb739abcb21a5c054f911ca3596d12b8a4c4c7f"},
{file = "typed_ast-1.4.3-cp36-cp36m-win32.whl", hash = "sha256:398e44cd480f4d2b7ee8d98385ca104e35c81525dd98c519acff1b79bdaac363"},
{file = "typed_ast-1.4.3-cp36-cp36m-win_amd64.whl", hash = "sha256:bff6ad71c81b3bba8fa35f0f1921fb24ff4476235a6e94a26ada2e54370e6da7"},
{file = "typed_ast-1.4.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0fb71b8c643187d7492c1f8352f2c15b4c4af3f6338f21681d3681b3dc31a266"},
{file = "typed_ast-1.4.3-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:760ad187b1041a154f0e4d0f6aae3e40fdb51d6de16e5c99aedadd9246450e9e"},
{file = "typed_ast-1.4.3-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:5feca99c17af94057417d744607b82dd0a664fd5e4ca98061480fd8b14b18d04"},
{file = "typed_ast-1.4.3-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:95431a26309a21874005845c21118c83991c63ea800dd44843e42a916aec5899"},
{file = "typed_ast-1.4.3-cp37-cp37m-win32.whl", hash = "sha256:aee0c1256be6c07bd3e1263ff920c325b59849dc95392a05f258bb9b259cf39c"},
{file = "typed_ast-1.4.3-cp37-cp37m-win_amd64.whl", hash = "sha256:9ad2c92ec681e02baf81fdfa056fe0d818645efa9af1f1cd5fd6f1bd2bdfd805"},
{file = "typed_ast-1.4.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b36b4f3920103a25e1d5d024d155c504080959582b928e91cb608a65c3a49e1a"},
{file = "typed_ast-1.4.3-cp38-cp38-manylinux1_i686.whl", hash = "sha256:067a74454df670dcaa4e59349a2e5c81e567d8d65458d480a5b3dfecec08c5ff"},
{file = "typed_ast-1.4.3-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:7538e495704e2ccda9b234b82423a4038f324f3a10c43bc088a1636180f11a41"},
{file = "typed_ast-1.4.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:af3d4a73793725138d6b334d9d247ce7e5f084d96284ed23f22ee626a7b88e39"},
{file = "typed_ast-1.4.3-cp38-cp38-win32.whl", hash = "sha256:f2362f3cb0f3172c42938946dbc5b7843c2a28aec307c49100c8b38764eb6927"},
{file = "typed_ast-1.4.3-cp38-cp38-win_amd64.whl", hash = "sha256:dd4a21253f42b8d2b48410cb31fe501d32f8b9fbeb1f55063ad102fe9c425e40"},
{file = "typed_ast-1.4.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f328adcfebed9f11301eaedfa48e15bdece9b519fb27e6a8c01aa52a17ec31b3"},
{file = "typed_ast-1.4.3-cp39-cp39-manylinux1_i686.whl", hash = "sha256:2c726c276d09fc5c414693a2de063f521052d9ea7c240ce553316f70656c84d4"},
{file = "typed_ast-1.4.3-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:cae53c389825d3b46fb37538441f75d6aecc4174f615d048321b716df2757fb0"},
{file = "typed_ast-1.4.3-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:b9574c6f03f685070d859e75c7f9eeca02d6933273b5e69572e5ff9d5e3931c3"},
{file = "typed_ast-1.4.3-cp39-cp39-win32.whl", hash = "sha256:209596a4ec71d990d71d5e0d312ac935d86930e6eecff6ccc7007fe54d703808"},
{file = "typed_ast-1.4.3-cp39-cp39-win_amd64.whl", hash = "sha256:9c6d1a54552b5330bc657b7ef0eae25d00ba7ffe85d9ea8ae6540d2197a3788c"},
{file = "typed_ast-1.4.3.tar.gz", hash = "sha256:fb1bbeac803adea29cedd70781399c99138358c26d05fcbd23c13016b7f5ec65"},
]
typing-extensions = [
{file = "typing_extensions-3.10.0.0-py2-none-any.whl", hash = "sha256:0ac0f89795dd19de6b97debb0c6af1c70987fd80a2d62d1958f7e56fcc31b497"},
{file = "typing_extensions-3.10.0.0-py3-none-any.whl", hash = "sha256:779383f6086d90c99ae41cf0ff39aac8a7937a9283ce0a414e5dd782f4c94a84"},
{file = "typing_extensions-3.10.0.0.tar.gz", hash = "sha256:50b6f157849174217d0656f99dc82fe932884fb250826c18350e159ec6cdf342"},
]
upend = []
urllib3 = [
{file = "urllib3-1.26.5-py2.py3-none-any.whl", hash = "sha256:753a0374df26658f99d826cfe40394a686d05985786d946fbe4165b5148f5a7c"},
{file = "urllib3-1.26.5.tar.gz", hash = "sha256:a7acd0977125325f516bda9735fa7142b909a8d01e8b2e4c8108d0984e6e0098"},
]
zipp = [
{file = "zipp-3.4.1-py3-none-any.whl", hash = "sha256:51cb66cc54621609dd593d1787f286ee42a5c0adbb4b29abea5a63edc3e03098"},
{file = "zipp-3.4.1.tar.gz", hash = "sha256:3607921face881ba3e026887d8150cca609d517579abe052ac81fc5aeffdbd76"},
]

View File

@ -1,19 +0,0 @@
[tool.poetry]
authors = ["Tomáš Mládek <t@mldk.cz>"]
description = ""
name = "fromksx"
version = "0.1.0"
[tool.poetry.dependencies]
click = "^8.0.1"
colorama = "^0.4.4"
python = "^3.7"
tqdm = "^4.61.1"
upend = {path = "../upend", develop = true}
psycopg2-binary = "^2.9.1"
[tool.poetry.dev-dependencies]
black = {version = "^21.6b0", allow-prereleases = true}
[build-system]
build-backend = "poetry.core.masonry.api"
requires = ["poetry-core>=1.0.0"]

View File

@ -1,76 +0,0 @@
import test from "ava";
import { Any, Query, Variable } from "./query";
test("query matches simple", (t) => {
const query = Query.matches("entity", "attribute", "value");
t.is(query.toString(), '(matches entity "attribute" "value")');
});
test("query matches anything", (t) => {
const query = Query.matches(Any, Any, Any);
t.is(query.toString(), "(matches ? ? ?)");
});
test("query matches array", (t) => {
const query = Query.matches("entity", "attribute", ["value1", "value2"]);
t.is(query.toString(), '(matches entity "attribute" (in "value1" "value2"))');
});
test("query matches addresses", (t) => {
const query = Query.matches("entity", "attribute", [
"@address1",
"@address2",
]);
t.is(
query.toString(),
'(matches entity "attribute" (in @address1 @address2))'
);
});
test("query matches numbers", (t) => {
const query = Query.matches("entity", "attribute", [1, 2]);
t.is(query.toString(), '(matches entity "attribute" (in 1 2))');
});
test("query matches variables", (t) => {
const query = Query.matches("entity", "attribute", Variable("a"));
t.is(query.toString(), '(matches entity "attribute" ?a)');
});
test("OR queries", (t) => {
const query = Query.or(
Query.matches("entity", "attribute1", "value2"),
Query.matches("entity", "attribute2", "value2")
);
t.is(
query.toString(),
'(or (matches entity "attribute1" "value2") (matches entity "attribute2" "value2"))'
);
});
test("AND queries", (t) => {
const query = Query.and(
Query.matches("entity", "attribute1", "value2"),
Query.matches("entity", "attribute2", "value2")
);
t.is(
query.toString(),
'(and (matches entity "attribute1" "value2") (matches entity "attribute2" "value2"))'
);
});
test("NOT query", (t) => {
const query = Query.not(Query.matches("entity", "attribute1", "value2"));
t.is(query.toString(), '(not (matches entity "attribute1" "value2"))');
});
test("JOIN queries", (t) => {
const query = Query.join(
Query.matches("entity", "attribute1", "value2"),
Query.matches("entity", "attribute2", "value2")
);
t.is(
query.toString(),
'(join (matches entity "attribute1" "value2") (matches entity "attribute2" "value2"))'
);
});

View File

@ -1,11 +0,0 @@
{
"compilerOptions": {
"target": "es5",
"lib": ["es2019", "DOM"],
"module": "commonjs",
"importsNotUsedAsValues": "error",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": true
}
}

View File

@ -9,5 +9,5 @@ crate-type = ["cdylib", "rlib"]
[dependencies]
wasm-bindgen = "0.2.87"
upend-base = { path = "../../base", features = ["wasm"] }
upend-base = { path = "../base", features = ["wasm"] }
wee_alloc = "0.4.5"

View File

@ -22,7 +22,7 @@
"svelte": "^3.55.0",
"svelte-preprocess": "^5.0.3",
"typescript": "^4.9.4",
"upend": "file:../tools/upend_js",
"upend": "file:../sdks/js",
"vite": "^4.0.3",
"vite-plugin-static-copy": "^0.15.0",
"web-ext": "^7.6.2"

View File

@ -27,8 +27,8 @@ dependencies:
specifier: ^4.9.4
version: 4.9.5
upend:
- specifier: file:../tools/upend_js
- version: file:../tools/upend_js
+ specifier: file:../sdks/js
+ version: file:../sdks/js
vite:
specifier: ^4.0.3
version: 4.3.8(sass@1.62.1)
@ -633,6 +633,14 @@ packages:
eslint-visitor-keys: 3.4.1
dev: true
/@upnd/wasm-node@0.1.0:
resolution: {integrity: sha512-lOW3Fqc48bj90r84+6zyvQHATInK6aFwUXCBmLk048e6RnfO7ELQSyYp9nso/efmrdcaRKnT+Cc5OObk1amblA==}
dev: false
/@upnd/wasm-web@0.1.0:
resolution: {integrity: sha512-GFZYYKTbaFrefCKYL+GCgEyMpTD9CbrhCAel3/o5gmHk7xNL3ox5N9AiCZFmkPbFWIE070Ow3zg+kQf2GY34rA==}
dev: false
/@webcomponents/webcomponentsjs@2.8.0:
resolution: {integrity: sha512-loGD63sacRzOzSJgQnB9ZAhaQGkN7wl2Zuw7tsphI5Isa0irijrRo6EnJii/GgjGefIFO8AIO7UivzRhFaEk9w==}
dev: false
@ -2022,6 +2030,7 @@ packages:
/glob@6.0.4:
resolution: {integrity: sha512-MKZeRNyYZAVVVG1oZeLaWie1uweH40m9AZwIwxyPbTSX4hHrVYSzLg0Ro5Z5R7XKkIX+Cc6oD1rqeDJnwsB8/A==}
requiresBuild: true
dependencies:
inflight: 1.0.6
inherits: 2.0.4
@ -2930,6 +2939,7 @@ packages:
/nan@2.17.0:
resolution: {integrity: sha512-2ZTgtl0nJsO0KQCjEpxcIr5D+Yv90plTitZt9JBfQvVJDS5seMl3FOvsh3+9CoYWXf/1l5OaZzzF6nDm4cagaQ==}
requiresBuild: true
dev: false
optional: true
@ -2949,6 +2959,7 @@ packages:
/ncp@2.0.0:
resolution: {integrity: sha512-zIdGUrPRFTUELUvr3Gmc7KZ2Sw/h1PiVM0Af/oHB6zgnV1ikqSfRk+TOufi79aHYCW3NiOXmr1BP5nWbzojLaA==}
hasBin: true
requiresBuild: true
dev: false
optional: true
@ -3622,6 +3633,7 @@ packages:
/rimraf@2.4.5:
resolution: {integrity: sha512-J5xnxTyqaiw06JjMftq7L9ouA448dw/E7dKghkP9WpKNuwmARNNg+Gk8/u5ryb9N/Yo2+z3MCwuqFK/+qPOPfQ==}
hasBin: true
requiresBuild: true
dependencies:
glob: 6.0.4
dev: false
@ -4657,18 +4669,14 @@ packages:
jszip: 3.10.1
dev: false
- file:../tools/upend_js:
- resolution: {directory: ../tools/upend_js, type: directory}
- name: upend
+ file:../sdks/js:
+ resolution: {directory: ../sdks/js, type: directory}
+ name: '@upnd/upend'
dependencies:
'@upnd/wasm-node': 0.1.0
'@upnd/wasm-web': 0.1.0
debug: 4.3.4
lru-cache: 7.18.3
- upend_wasm: file:../tools/upend_wasm/pkg
transitivePeerDependencies:
- supports-color
dev: false
- file:../tools/upend_wasm/pkg:
- resolution: {directory: ../tools/upend_wasm/pkg, type: directory}
- name: upend_wasm
- dev: false

View File

@ -312,7 +312,7 @@
</main>
<style lang="scss">
@use "../../webui/src/styles/colors";
@use "../../webui/src/lib/styles/colors";
main {
padding: 1em;

View File

@ -1,3 +1,3 @@
@use "../webui/src/styles/common";
@use "../webui/src/styles/colors-app";
@use "../webui/src/styles/fonts";
@use "../../webui/src/lib/styles/common";
@use "../../webui/src/lib/styles/colors-app";
@use "../../webui/src/lib/styles/fonts";

View File

@ -5,7 +5,8 @@ module.exports = {
'eslint:recommended',
'plugin:@typescript-eslint/recommended',
'plugin:svelte/recommended',
- 'prettier'
+ 'prettier',
+ 'plugin:storybook/recommended'
],
parser: '@typescript-eslint/parser',
plugins: ['@typescript-eslint'],

3 webui/.gitignore vendored
View File

@ -10,3 +10,6 @@ vite.config.js.timestamp-*
vite.config.ts.timestamp-*
/static/vendor
# Sentry Config File
.sentryclirc

26 webui/.storybook/main.ts Normal file
View File

@ -0,0 +1,26 @@
import type { StorybookConfig } from '@storybook/sveltekit';
const config: StorybookConfig = {
stories: ['../src/**/*.mdx', '../src/**/*.stories.@(js|jsx|mjs|ts|tsx)'],
addons: [
'@storybook/addon-links',
'@storybook/addon-essentials',
'@storybook/addon-interactions'
],
framework: {
name: '@storybook/sveltekit',
options: {}
},
docs: {
autodocs: 'tag'
},
viteFinal: (config) => {
config.server!.proxy = {
'/api': {
target: 'http://localhost:8099/'
}
};
return config;
}
};
export default config;
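Two things worth noting here: the `stories` glob picks up any `*.stories.ts` under `src/`, and the `viteFinal` hook proxies `/api` to a locally served vault on port 8099 (the `storybook` script in `webui/package.json`, further below, starts that instance alongside Storybook). A sketch of a story file the glob would match; the component name and path are hypothetical:

// webui/src/lib/components/Example.stories.ts — hypothetical example
import type { Meta, StoryObj } from '@storybook/svelte';
import Example from './Example.svelte';

const meta: Meta<Example> = {
  title: 'Example',
  component: Example
};
export default meta;

export const Default: StoryObj<Example> = {
  args: {}
};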

View File

@ -0,0 +1,16 @@
import type { Preview } from '@storybook/svelte';
import '../src/lib/styles/main.scss';
const preview: Preview = {
parameters: {
actions: { argTypesRegex: '^on[A-Z].*' },
controls: {
matchers: {
color: /(background|color)$/i,
date: /Date$/i
}
}
}
};
export default preview;

View File

@ -4,15 +4,25 @@
"private": true,
"type": "module",
"scripts": {
"dev": "vite dev",
"dev": "vite dev --force",
"build": "vite build",
"preview": "vite preview",
"check": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json",
"check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
"lint": "prettier --check . && eslint .",
"format": "prettier --write ."
"format": "prettier --write .",
"storybook": "concurrently \"npm run storybook:serve\" \"cargo run -- serve ../example_vault --clean --no-browser --reinitialize --rescan-mode mirror --bind 127.0.0.1:8099\"",
"storybook:serve": "storybook dev -p 6006",
"build-storybook": "storybook build"
},
"devDependencies": {
"@storybook/addon-essentials": "^7.6.16",
"@storybook/addon-interactions": "^7.6.16",
"@storybook/addon-links": "^7.6.16",
"@storybook/blocks": "^7.6.16",
"@storybook/svelte": "^7.6.16",
"@storybook/sveltekit": "^7.6.16",
"@storybook/test": "^7.6.16",
"@sveltejs/adapter-auto": "^3.0.0",
"@sveltejs/kit": "^2.0.0",
"@sveltejs/vite-plugin-svelte": "^3.0.0",
@@ -21,9 +31,11 @@
"@typescript-eslint/parser": "^6.0.0",
"eslint": "^8.56.0",
"eslint-config-prettier": "^9.1.0",
"eslint-plugin-storybook": "^0.8.0",
"eslint-plugin-svelte": "^2.35.1",
"prettier": "^3.1.1",
"prettier-plugin-svelte": "^3.1.2",
"storybook": "^7.6.16",
"svelte": "^4.2.7",
"svelte-check": "^3.6.0",
"tslib": "^2.4.1",
@@ -33,6 +45,7 @@
"dependencies": {
"@ibm/plex": "^6.3.0",
"@recogito/annotorious": "^2.7.11",
"@sentry/sveltekit": "^7.109.0",
"@sveltejs/adapter-static": "^3.0.1",
"@types/d3": "^7.4.3",
"@types/debug": "^4.1.12",
@@ -42,8 +55,8 @@
"@types/node": "^18.19.8",
"@types/three": "^0.160.0",
"@types/wavesurfer.js": "^6.0.12",
"@upnd/upend": "file:../tools/upend_js",
"@upnd/wasm-web": "file:../tools/upend_wasm/pkg-web",
"@upnd/upend": "file:../sdks/js",
"@upnd/wasm-web": "file:../wasm/pkg-web",
"boxicons": "^2.1.4",
"d3": "^7.8.5",
"date-fns": "^2.30.0",

(File diff suppressed because it is too large.)

webui/src/hooks.client.ts
@@ -0,0 +1,23 @@
import * as Sentry from '@sentry/sveltekit';
import { handleErrorWithSentry, replayIntegration } from '@sentry/sveltekit';
Sentry.init({
dsn: 'https://9b966481f93abec5d80f1d48dd1afec1@o4506614333308928.ingest.us.sentry.io/4506614334423040',
tracesSampleRate: 1.0,
// This sets the replay session sample rate to 10%. You may want it at 100%
// while developing, and a lower rate in production.
replaysSessionSampleRate: 0.1,
// If the entire session is not sampled, use the below sample rate to sample
// sessions when an error occurs.
replaysOnErrorSampleRate: 1.0,
// If you don't want to use Session Replay, just remove the line below:
integrations: [replayIntegration(), Sentry.feedbackIntegration({ colorScheme: 'dark' })],
enabled: process.env.NODE_ENV !== 'development'
});
// If you have a custom error handler, pass it to `handleErrorWithSentry`
export const handleError = handleErrorWithSentry();
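With `enabled` set as above, nothing is reported from development builds; in production, uncaught errors flow through `handleErrorWithSentry`. Handled errors can still be sent explicitly through the same client — a minimal sketch, with `riskyOperation` as a hypothetical stand-in:

import * as Sentry from '@sentry/sveltekit';

declare function riskyOperation(): void; // hypothetical, may throw

try {
  riskyOperation();
} catch (e) {
  // Reports the handled error to the client initialized above.
  Sentry.captureException(e);
}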

webui/src/hooks.server.ts
@@ -0,0 +1,17 @@
import { sequence } from '@sveltejs/kit/hooks';
import * as Sentry from '@sentry/sveltekit';
import { handleErrorWithSentry, sentryHandle } from '@sentry/sveltekit';
Sentry.init({
dsn: 'https://9b966481f93abec5d80f1d48dd1afec1@o4506614333308928.ingest.us.sentry.io/4506614334423040',
tracesSampleRate: 1.0
// uncomment the line below to enable Spotlight (https://spotlightjs.com)
// spotlight: import.meta.env.DEV,
});
// If you have custom handlers, make sure to place them after `sentryHandle()` in the `sequence` function.
export const handle = sequence(sentryHandle());
// If you have a custom error handler, pass it to `handleErrorWithSentry`
export const handleError = handleErrorWithSentry();
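As the comment above says, any custom server hooks belong after `sentryHandle()` inside `sequence`, so Sentry wraps the whole request. A sketch with a hypothetical pass-through handler:

import { sequence } from '@sveltejs/kit/hooks';
import { sentryHandle } from '@sentry/sveltekit';
import type { Handle } from '@sveltejs/kit';

// Hypothetical custom hook; it runs inside the Sentry-instrumented request.
const myHandle: Handle = async ({ event, resolve }) => resolve(event);

export const handle = sequence(sentryHandle(), myHandle);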

@@ -1,6 +1,29 @@
import { UpEndApi } from '@upnd/upend';
import { UpEndWasmExtensionsWeb } from '@upnd/upend/wasm/web';
import wasmURL from '@upnd/wasm-web/upend_wasm_bg.wasm?url';
import { type StartStopNotifier, writable, type Writable } from 'svelte/store';
import * as Sentry from '@sentry/sveltekit';
const wasm = new UpEndWasmExtensionsWeb(wasmURL);
export default new UpEndApi({ instanceUrl: '/', wasmExtensions: wasm });
const api = new UpEndApi({ instanceUrl: '/', wasmExtensions: wasm });
export default api;
export const currentUser: Writable<string | undefined> = writable(
undefined as string | undefined,
((set) => {
api.authStatus().then((result) => {
set(result?.user);
Sentry.setUser({ id: result?.user });
});
}) as StartStopNotifier<string | undefined>
);
export async function login(credentials: { username: string; password: string }) {
await api.authenticate(credentials);
window.location.reload();
}
export async function logout() {
await api.resetAuth();
window.location.reload();
}
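Taken together: the first subscription to `currentUser` triggers the `StartStopNotifier` above, which resolves the user via `api.authStatus()` and tags the Sentry scope. A usage sketch from any module (credentials are placeholders):

import { currentUser, login, logout } from '$lib/api';

const unsubscribe = currentUser.subscribe((user) => {
  console.log(user ? `logged in as ${user}` : 'not logged in');
});

// login() authenticates, then reloads the page; on the fresh load the
// store is populated again from api.authStatus().
await login({ username: 'alice', password: 'correct horse' }); // placeholders
unsubscribe();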

@@ -1,9 +1,12 @@
<script context="module" lang="ts">
import mitt from 'mitt';
import type { Address } from '@upnd/upend/types';
export type AddEvents = {
choose: void;
files: File[];
urls: string[];
destination: Address;
};
export const addEmitter = mitt<AddEvents>();
</script>
@@ -11,14 +14,28 @@
<script lang="ts">
import Icon from './utils/Icon.svelte';
import IconButton from './utils/IconButton.svelte';
import ProgressBar from './utils/ProgressBar.svelte';
import api from '$lib/api';
import { goto } from '$app/navigation';
import { i18n } from '$lib/i18n';
import { selected } from '$lib/components/EntitySelect.svelte';
import Modal from '$lib/components/layout/Modal.svelte';
import Selector, { type SelectorValue } from '$lib/components/utils/Selector.svelte';
import { ATTR_IN } from '@upnd/upend/constants';
let files: File[] = [];
let URLs: string[] = [];
let uploading = false;
let abortController: AbortController | undefined;
$: visible = files.length + URLs.length > 0;
let destination: Address | undefined;
let progress: Record<string, number> = {};
let totalProgress: number | undefined;
let filesElement: HTMLDivElement;
$: visible = files.length + URLs.length > 0 || destination;
addEmitter.on('files', (ev) => {
ev.forEach((file) => {
@@ -29,13 +46,60 @@
});
});
addEmitter.on('destination', (ev) => {
destination = ev;
});
function onDestinationSelected(ev: CustomEvent<SelectorValue | undefined>) {
if (ev.detail?.t === 'Address') {
destination = ev.detail.c;
} else {
destination = undefined;
}
}
async function upload() {
uploading = true;
try {
const addresses = await Promise.all(files.map(async (file) => api.putBlob(file)));
abortController = new AbortController();
const addresses: string[] = [];
for (const [idx, file] of files.entries()) {
filesElement
?.querySelectorAll('.entry')
[idx]?.scrollIntoView({ behavior: 'smooth', block: 'center' });
goto(`/browse/${addresses.join(',')}`);
const address = await api.putBlob(file, {
abortController,
onProgress: (p) => {
progress[file.name] = (p.loaded / p.total) * 100;
totalProgress = Object.values(progress).reduce((a, b) => a + b, 0) / files.length;
},
timeout: -1
});
if (destination) {
await api.putEntry({
entity: address,
attribute: ATTR_IN,
value: {
t: 'Address',
c: destination
}
});
}
addresses.push(address);
if (!uploading) {
break;
}
}
if (addresses.length == 1) {
goto(`/browse/${addresses[0]}`);
} else {
$selected = addresses;
goto(`/browse/selected`);
}
} catch (error) {
alert(error);
}
@@ -45,41 +109,106 @@
}
function reset() {
if (!uploading) {
files = [];
URLs = [];
if (uploading) {
const msg = $i18n.t('Are you sure you want to cancel the upload?');
if (!confirm(msg)) return;
}
abortController?.abort();
files = [];
URLs = [];
progress = {};
uploading = false;
destination = undefined;
}
function onKeydown(event: KeyboardEvent) {
if (!files.length) return;
if (event.key === 'Enter') {
event.preventDefault();
upload();
}
if (event.key === 'Escape') {
reset();
}
}
function onBeforeUnload(ev: BeforeUnloadEvent) {
if (files.length || uploading) {
ev.preventDefault();
ev.returnValue = true;
}
}
</script>
<svelte:body on:keydown={(ev) => ev.key === 'Escape' && reset()} />
<svelte:window on:beforeunload={onBeforeUnload} />
<svelte:body on:keydown={onKeydown} />
<!-- svelte-ignore a11y-click-events-have-key-events a11y-no-static-element-interactions -->
<div class="addmodal-container" class:visible class:uploading on:click={reset}>
<div class="addmodal" on:click|stopPropagation>
<div class="files">
{#if visible}
<Modal on:close={reset}>
<div class="files" bind:this={filesElement}>
{#each files as file}
<div class="file">
{#if file.type.startsWith('image')}
<img src={URL.createObjectURL(file)} alt="To be uploaded." />
{:else}
<div class="icon">
<Icon name="file" />
<div class="entry">
<div class="row">
<div class="file">
<div class="icon">
<Icon name="file" />
</div>
<div class="label">{file.name}</div>
{#if file.type.startsWith('image')}
<img src={URL.createObjectURL(file)} alt="To be uploaded." />
{/if}
</div>
{#if !uploading}
<IconButton
small
subdued
color="#dc322f"
name="x-circle"
on:click={() => (files = files.filter((f) => f !== file))}
/>
{/if}
</div>
{#if uploading && files.length > 1}
<div class="progress">
<ProgressBar value={progress[file.name] || 0} />
</div>
{/if}
<div class="label">{file.name}</div>
</div>
{/each}
{#if !uploading}
<div class="entry add">
<IconButton outline name="plus-circle" on:click={() => addEmitter.emit('choose')} />
</div>
{/if}
</div>
<div class="controls">
<IconButton name="upload" on:click={upload} />
<div class="controls-destination">
<div class="label"><Icon plain name="download" /> {$i18n.t('Destination')}</div>
<Selector
initial={destination ? { t: 'Address', c: destination } : undefined}
types={['Address', 'NewAddress']}
placeholder={$i18n.t('Choose automatically') || ''}
on:input={onDestinationSelected}
/>
</div>
<div class="controls-submit">
<IconButton small disabled={uploading} name="upload" on:click={upload}>
{$i18n.t('Upload')}
</IconButton>
</div>
</div>
</div>
</div>
{#if uploading}
<div class="progress">
<ProgressBar value={totalProgress} />
</div>
{/if}
</Modal>
{/if}
<style lang="scss">
.addmodal-container {
position: absolute;
position: fixed;
left: 0;
top: 0;
width: 100vw;
@@ -88,6 +217,7 @@
color: var(--foreground);
display: none;
&.visible {
display: unset;
}
@@ -96,9 +226,11 @@
cursor: progress;
.addmodal {
filter: brightness(0.5);
filter: brightness(0.85);
}
}
z-index: 99;
}
.addmodal {
@@ -113,6 +245,7 @@
border: solid 2px var(--foreground);
border-radius: 8px;
padding: 1rem;
min-width: 33vw;
}
.files {
@@ -124,12 +257,20 @@
overflow-y: auto;
max-height: 66vh;
width: 80vw;
}
.entry .row {
display: flex;
align-items: center;
gap: 0.5rem;
}
.file {
flex-grow: 1;
display: flex;
align-items: center;
flex-direction: column;
gap: 1rem;
border: 1px solid var(--foreground);
border-radius: 4px;
@@ -147,14 +288,39 @@
.label {
flex-grow: 1;
text-align: center;
}
}
.entry.add {
display: flex;
flex-direction: column;
font-size: 1.5em;
}
.controls {
display: flex;
justify-content: center;
font-size: 48px;
align-items: center;
font-size: 3em;
margin-top: 0.5rem;
gap: 1rem;
}
.controls-destination {
display: flex;
flex-direction: column;
gap: 0.5rem;
font-size: 1rem;
flex-grow: 3;
}
.controls-submit {
margin: 0 1rem;
}
.progress {
margin-top: 0.5rem;
margin-bottom: 0.5rem;
}
</style>
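The reworked `upload()` above boils down to: stream each file with `putBlob` (per-file progress, cancellation via the shared `AbortController`, no client-side timeout), then, if a destination was chosen, link the new entity into that group with an `ATTR_IN` entry. A reduced sketch of one iteration, outside the component (`groupAddress` is hypothetical; `timeout: -1` is assumed to disable the timeout, matching its use above):

import api from '$lib/api';
import { ATTR_IN } from '@upnd/upend/constants';

async function uploadToGroup(file: File, groupAddress: string): Promise<string> {
  const abortController = new AbortController();
  const address = await api.putBlob(file, {
    abortController,
    onProgress: (p) => console.log(`${((p.loaded / p.total) * 100).toFixed(0)}%`),
    timeout: -1
  });
  // Same linking step as the component performs: entity IN destination group.
  await api.putEntry({
    entity: address,
    attribute: ATTR_IN,
    value: { t: 'Address', c: groupAddress }
  });
  return address;
}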

@@ -11,7 +11,6 @@
export let address: string | undefined = undefined;
export let index: number;
export let only: boolean;
export let background = 'var(--background-lighter)';
export let forceDetail = false;
let shifted = false;
let key = Math.random();
@@ -70,12 +69,7 @@
</script>
<!-- svelte-ignore a11y-no-static-element-interactions -->
<div
class="browse-column"
class:detail
style="--background-color: {background}"
on:mousemove={(ev) => (shifted = ev.shiftKey)}
>
<div class="browse-column" class:detail on:mousemove={(ev) => (shifted = ev.shiftKey)}>
<div class="view" style="--width: {width}px">
<header>
{#if address}
@@ -120,7 +114,11 @@
</div>
<style lang="scss">
@use 'sass:color';
@use '../styles/colors.scss';
.browse-column {
position: relative;
display: flex;
}
@@ -144,12 +142,14 @@
display: flex;
flex-direction: column;
background: var(--background-color);
background: var(--background-lighter);
color: var(--foreground-lighter);
border: 1px solid var(--foreground-lightest);
border-radius: 0.5em;
padding: 1rem;
z-index: 1;
// transition: min-width 0.2s, max-width 0.2s;
// TODO - container has nowhere to scroll, breaking `detail` scroll
@@ -165,6 +165,25 @@
}
}
.browse-column.image-background {
--foreground: var(--foreground-lighter);
--background-lighter: rgba(255, 255, 255, 0.08);
--color-active: #{color.scale(colors.$orange, $alpha: -75%)};
}
.background {
position: absolute;
top: 0;
left: 0;
width: calc(100% - 0.5rem);
height: 100%;
border-radius: 0.5em;
background-size: cover;
background-position: center;
filter: blur(0.5rem) brightness(0.3) contrast(1.1);
z-index: 0;
}
.resizeHandle {
cursor: ew-resize;
height: 100%;

@@ -5,12 +5,19 @@
import { i18n } from '../i18n';
import LabelBorder from './utils/LabelBorder.svelte';
import { createEventDispatcher } from 'svelte';
const dispatch = createEventDispatcher();
import { type Address } from '@upnd/upend/types';
import Icon from '$lib/components/utils/Icon.svelte';
const dispatch = createEventDispatcher<{
highlighted: string | undefined;
add: Address;
remove: Address;
}>();
export let entities: string[];
export let hide = false;
export let header = '';
export let icon: string | null = null;
export let confirmRemoveMessage: string | null = $i18n.t('Are you sure you want to remove this?');
export let emptyMessage = $i18n.t('Nothing to show.');
@@ -19,8 +26,8 @@
$: if (adding && selector) selector.focus();
async function add(ev: CustomEvent<SelectorValue>) {
if (ev.detail.t !== 'Address') {
async function add(ev: CustomEvent<SelectorValue | undefined>) {
if (ev.detail?.t !== 'Address') {
return;
}
dispatch('add', ev.detail.c);
@@ -34,7 +41,10 @@
</script>
<LabelBorder {hide}>
<span slot="header">{header}</span>
<span slot="header"
>{#if icon}<Icon plain name={icon} />
{/if}{header}</span
>
{#if adding}
<div class="selector">

(Some files were not shown because too many files have changed in this diff.)