55 Commits

Author SHA1 Message Date
58603c5180 Run multiple image download queue in parallel (#1203) 2025-12-06 01:51:36 +01:00
a429b0ace9 Run multiple image download queue in parallel 2025-12-06 01:49:16 +01:00
0f62854128 Fix images recycling keys 2025-12-06 01:49:16 +01:00
2deeaaf97e Enable docker cross compile for front & auth 2025-12-06 01:49:16 +01:00
6f07e51a07 Handle duplicated studios (#1202) 2025-12-06 00:51:27 +01:00
c839fc826e Fix front type for original 2025-12-06 00:48:19 +01:00
10ac7e1ec6 Handle duplicated studios 2025-12-06 00:48:19 +01:00
79075e497d Lots of api fixes + error api for scanner (#1201) 2025-12-06 00:06:25 +01:00
8109b7ada6 Format stuff 2025-12-05 23:42:52 +01:00
30f26b2f6a Allow insert without original translation 2025-12-05 23:38:18 +01:00
a1b975cc5d Delete timedout running requests 2025-12-04 17:58:32 +01:00
4f2b2d2cd2 Handle seasons with holes in episode numbers 2025-12-04 17:58:32 +01:00
d3ccd14fe0 Fix sqlarr 2025-12-04 17:58:32 +01:00
7f5bc2f57c Fix logout on deleted accounts 2025-12-04 17:58:32 +01:00
c2c9bbe555 Prevent duplicated staff members 2025-12-04 17:58:32 +01:00
20e6fbbc33 Remove well-known from otel 2025-12-04 17:58:31 +01:00
5f9064ec37 Prevent all scanner slave to process requests 2025-12-04 17:58:31 +01:00
433b90a3fb Add requests errors in db and api 2025-12-04 17:58:31 +01:00
81c6f68509 Fix shell.nix for sharp 2025-12-04 17:58:31 +01:00
96ac331903 Fix downloaded images volume on docker 2025-12-04 17:58:31 +01:00
renovate[bot] f1c2724a7b chore(deps): update traefik docker tag to v3.6 (#1196) 2025-12-04 12:12:42 +00:00
acelinkio 12fe7c157f .github remove autosync references + fix whitespace (#1198) 2025-12-02 09:26:04 +01:00
c29ad99ca0 Fix pg admin password (#1186) 2025-12-02 09:22:49 +01:00
acelinkio a99f29074c scanner: adding the probes back (#1197) 2025-12-01 18:30:23 -08:00
Arlan Lloyd f449a0878a adding the probes back 2025-12-02 02:26:34 +00:00
acelinkio 097985ab6d scanner: refactor otel integration (#1194) 2025-12-01 23:50:28 +01:00
11c300ecf7 Add status api to get scanner's status (#1195) 2025-12-01 20:18:21 +01:00
1e975ce238 Set null not distinct in scanner request constraint 2025-12-01 20:15:57 +01:00
b39fa4262d Add status api to get scanner's status 2025-12-01 20:04:16 +01:00
d7699389bc Fix missing kid in apikeys jwt 2025-12-01 20:04:16 +01:00
renovate[bot] 1036e9f3f3 chore(deps): update dependency @biomejs/biome to v2.3.7 (#1189) 2025-12-01 10:36:11 +01:00
renovate[bot] b4749f3ed3 fix(deps): update aws-sdk-go-v2 monorepo (#1191) 2025-12-01 10:34:25 +01:00
renovate[bot] a20c61206f fix(deps): update module github.com/labstack/echo-jwt/v4 to v4.4.0 (#1192) 2025-12-01 10:34:14 +01:00
renovate[bot] 0644a43cb1 chore(deps): update actions/checkout action to v6 (#1193) 2025-12-01 10:34:05 +01:00
af4742ae0b Fix sqlarr of api (#1188) 2025-11-30 19:29:46 +00:00
acelinkio e401ca98c0 downgrade cloudpirates postgres 0.12.0 (#1187) 2025-11-28 13:15:55 -08:00
Arlan Lloyd a756c875fd 0.12.1 has a bug with rendering nested values 2025-11-28 21:13:10 +00:00
acelinkio 2ef26e5d02 add devspace (#1173) 2025-11-28 20:52:10 +01:00
acelinkio e7d9002156 kyoo_api logs redact password & other sensitive fields (#1182) 2025-11-28 16:42:27 +00:00
acelinkio 28d2e193aa kyoo_api extension install specify schema (#1183) 2025-11-28 16:39:47 +00:00
ce5bee11c0 Use unnest in insertion methods (#1185) 2025-11-28 17:28:11 +01:00
60d59d7f7b Wrap every insert with a trace 2025-11-28 17:25:29 +01:00
464d720ef9 Fix unnest issues 2025-11-28 17:11:43 +01:00
8fc279d2ed Use unnest everywhere 2025-11-28 17:11:43 +01:00
a45e992339 Properly type unnestValues return 2025-11-28 17:11:43 +01:00
5f8ddd435a Use unnest for entries 2025-11-28 17:11:43 +01:00
d822463fe0 Add a trace for api migrations 2025-11-28 17:11:43 +01:00
acelinkio 3a0cbf786d fix(deps): update aws-sdk-go-v2 monorepo (#1159) 2025-11-24 13:12:25 -08:00
renovate[bot] dfb4777a5d fix(deps): update aws-sdk-go-v2 monorepo 2025-11-24 20:29:44 +00:00
acelinkio eea32c47e9 chore(deps): update postgres docker tag to v0.12.1 (#1181) 2025-11-24 09:49:58 -08:00
renovate[bot] 6bcd03b18e chore(deps): update postgres docker tag to v0.12.1 2025-11-24 13:46:46 +00:00
acelinkio 87a3df6897 chore(deps): update dependency @biomejs/biome to v2.3.6 (#1179) 2025-11-23 18:38:41 -08:00
acelinkio 7f7a16e9b5 chore(deps): update postgres docker tag to v0.12.0 (#1180) 2025-11-23 18:38:17 -08:00
renovate[bot] b95dd9056b chore(deps): update postgres docker tag to v0.12.0 2025-11-24 02:22:25 +00:00
renovate[bot] 5044f941b1 chore(deps): update dependency @biomejs/biome to v2.3.6 2025-11-24 02:08:06 +00:00
77 changed files with 1371 additions and 802 deletions


@@ -38,7 +38,7 @@ PUBLIC_URL=http://localhost:8901
# Set `verified` to true if you don't wanna manually verify users.
EXTRA_CLAIMS='{"permissions": ["core.read", "core.play"], "verified": false}'
# This is the permissions of the first user (aka the first user is admin)
- FIRST_USER_CLAIMS='{"permissions": ["users.read", "users.write", "apikeys.read", "apikeys.write", "users.delete", "core.read", "core.write", "core.play", "scanner.trigger"], "verified": true}'
+ FIRST_USER_CLAIMS='{"permissions": ["users.read", "users.write", "users.delete", "apikeys.read", "apikeys.write", "core.read", "core.write", "core.play", "scanner.trigger"], "verified": true}'
# Guest (meaning unlogged in users) can be:
# unauthorized (they need to connect before doing anything)


@@ -26,7 +26,7 @@ jobs:
--health-timeout 5s
--health-retries 5
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- uses: oven-sh/setup-bun@v2
- name: Install dependencies


@@ -15,7 +15,7 @@ jobs:
postgres:
image: postgres:15
ports:
- "5432:5432"
- "5432:5432"
env:
POSTGRES_USER: kyoo
POSTGRES_PASSWORD: password
@@ -25,7 +25,7 @@ jobs:
--health-timeout 5s
--health-retries 5
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- uses: gacts/install-hurl@v1


@@ -9,7 +9,7 @@ jobs:
run:
working-directory: ./api
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- name: Setup Biome
uses: biomejs/setup-biome@v2
@@ -26,7 +26,7 @@ jobs:
run:
working-directory: ./front
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- name: Setup Biome
uses: biomejs/setup-biome@v2
@@ -37,10 +37,10 @@ jobs:
run: biome ci .
scanner:
name: "Lint scanner/autosync"
name: "Lint scanner"
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- uses: chartboost/ruff-action@v1
with:
@@ -53,7 +53,7 @@ jobs:
run:
working-directory: ./transcoder
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- name: Run go fmt
run: if [ "$(gofmt -s -l . | wc -l)" -gt 0 ]; then exit 1; fi
@@ -65,7 +65,7 @@ jobs:
run:
working-directory: ./auth
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- name: Run go fmt
run: if [ "$(gofmt -s -l . | wc -l)" -gt 0 ]; then exit 1; fi


@@ -34,11 +34,6 @@ jobs:
label: scanner
image: ${{ github.repository_owner }}/kyoo_scanner
- - context: ./autosync
- dockerfile: Dockerfile
- label: autosync
- image: ${{ github.repository_owner }}/kyoo_autosync
- context: ./transcoder
dockerfile: Dockerfile
label: transcoder
@@ -52,7 +47,7 @@ jobs:
DOCKERHUB_ENABLED: ${{ secrets.DOCKER_USERNAME && secrets.DOCKER_PASSWORD && 'true' || 'false' }}
name: Build ${{matrix.label}}
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- uses: dorny/paths-filter@v3
id: filter


@@ -15,7 +15,7 @@ jobs:
steps:
- name: Checkout repository
- uses: actions/checkout@v5
+ uses: actions/checkout@v6
- name: Set up Helm
uses: azure/setup-helm@v4


@@ -2,7 +2,7 @@ name: Native build
on:
push:
tags:
- - v*
+ - v*
jobs:
update:
@@ -13,7 +13,7 @@ jobs:
working-directory: ./front
steps:
- name: Checkout repository
- uses: actions/checkout@v5
+ uses: actions/checkout@v6
# This is required because GHA doesn't support secrets in the `if` condition
- name: Check if Expo build is enabled


@@ -13,7 +13,7 @@ jobs:
working-directory: ./front
steps:
- name: Checkout repository
- uses: actions/checkout@v5
+ uses: actions/checkout@v6
# This is required because GHA doesn't support secrets in the `if` condition
- name: Check if Expo build is enabled


@@ -2,7 +2,7 @@ name: Release
on:
push:
tags:
- - v*
+ - v*
jobs:
update:
@@ -10,7 +10,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
- uses: actions/checkout@v5
+ uses: actions/checkout@v6
- name: Set correct versions
run: |

.gitignore

@@ -1,4 +1,5 @@
/video
+ .devspace/
.env
.venv
.idea
@@ -6,6 +7,4 @@
log.html
output.xml
report.html
- chart/charts
- chart/Chart.lock
tmp


@@ -20,7 +20,7 @@
"sharp": "^0.34.4",
},
"devDependencies": {
"@biomejs/biome": "2.3.5",
"@biomejs/biome": "2.3.7",
"@types/pg": "^8.15.5",
},
},
@@ -29,23 +29,23 @@
"drizzle-orm@0.44.7": "patches/drizzle-orm@0.44.7.patch",
},
"packages": {
"@biomejs/biome": ["@biomejs/biome@2.3.5", "", { "optionalDependencies": { "@biomejs/cli-darwin-arm64": "2.3.5", "@biomejs/cli-darwin-x64": "2.3.5", "@biomejs/cli-linux-arm64": "2.3.5", "@biomejs/cli-linux-arm64-musl": "2.3.5", "@biomejs/cli-linux-x64": "2.3.5", "@biomejs/cli-linux-x64-musl": "2.3.5", "@biomejs/cli-win32-arm64": "2.3.5", "@biomejs/cli-win32-x64": "2.3.5" }, "bin": { "biome": "bin/biome" } }, "sha512-HvLhNlIlBIbAV77VysRIBEwp55oM/QAjQEin74QQX9Xb259/XP/D5AGGnZMOyF1el4zcvlNYYR3AyTMUV3ILhg=="],
"@biomejs/biome": ["@biomejs/biome@2.3.7", "", { "optionalDependencies": { "@biomejs/cli-darwin-arm64": "2.3.7", "@biomejs/cli-darwin-x64": "2.3.7", "@biomejs/cli-linux-arm64": "2.3.7", "@biomejs/cli-linux-arm64-musl": "2.3.7", "@biomejs/cli-linux-x64": "2.3.7", "@biomejs/cli-linux-x64-musl": "2.3.7", "@biomejs/cli-win32-arm64": "2.3.7", "@biomejs/cli-win32-x64": "2.3.7" }, "bin": { "biome": "bin/biome" } }, "sha512-CTbAS/jNAiUc6rcq94BrTB8z83O9+BsgWj2sBCQg9rD6Wkh2gjfR87usjx0Ncx0zGXP1NKgT7JNglay5Zfs9jw=="],
"@biomejs/cli-darwin-arm64": ["@biomejs/cli-darwin-arm64@2.3.5", "", { "os": "darwin", "cpu": "arm64" }, "sha512-fLdTur8cJU33HxHUUsii3GLx/TR0BsfQx8FkeqIiW33cGMtUD56fAtrh+2Fx1uhiCsVZlFh6iLKUU3pniZREQw=="],
"@biomejs/cli-darwin-arm64": ["@biomejs/cli-darwin-arm64@2.3.7", "", { "os": "darwin", "cpu": "arm64" }, "sha512-LirkamEwzIUULhXcf2D5b+NatXKeqhOwilM+5eRkbrnr6daKz9rsBL0kNZ16Hcy4b8RFq22SG4tcLwM+yx/wFA=="],
"@biomejs/cli-darwin-x64": ["@biomejs/cli-darwin-x64@2.3.5", "", { "os": "darwin", "cpu": "x64" }, "sha512-qpT8XDqeUlzrOW8zb4k3tjhT7rmvVRumhi2657I2aGcY4B+Ft5fNwDdZGACzn8zj7/K1fdWjgwYE3i2mSZ+vOA=="],
"@biomejs/cli-darwin-x64": ["@biomejs/cli-darwin-x64@2.3.7", "", { "os": "darwin", "cpu": "x64" }, "sha512-Q4TO633kvrMQkKIV7wmf8HXwF0dhdTD9S458LGE24TYgBjSRbuhvio4D5eOQzirEYg6eqxfs53ga/rbdd8nBKg=="],
"@biomejs/cli-linux-arm64": ["@biomejs/cli-linux-arm64@2.3.5", "", { "os": "linux", "cpu": "arm64" }, "sha512-u/pybjTBPGBHB66ku4pK1gj+Dxgx7/+Z0jAriZISPX1ocTO8aHh8x8e7Kb1rB4Ms0nA/SzjtNOVJ4exVavQBCw=="],
"@biomejs/cli-linux-arm64": ["@biomejs/cli-linux-arm64@2.3.7", "", { "os": "linux", "cpu": "arm64" }, "sha512-inHOTdlstUBzgjDcx0ge71U4SVTbwAljmkfi3MC5WzsYCRhancqfeL+sa4Ke6v2ND53WIwCFD5hGsYExoI3EZQ=="],
"@biomejs/cli-linux-arm64-musl": ["@biomejs/cli-linux-arm64-musl@2.3.5", "", { "os": "linux", "cpu": "arm64" }, "sha512-eGUG7+hcLgGnMNl1KHVZUYxahYAhC462jF/wQolqu4qso2MSk32Q+QrpN7eN4jAHAg7FUMIo897muIhK4hXhqg=="],
"@biomejs/cli-linux-arm64-musl": ["@biomejs/cli-linux-arm64-musl@2.3.7", "", { "os": "linux", "cpu": "arm64" }, "sha512-/afy8lto4CB8scWfMdt+NoCZtatBUF62Tk3ilWH2w8ENd5spLhM77zKlFZEvsKJv9AFNHknMl03zO67CiklL2Q=="],
"@biomejs/cli-linux-x64": ["@biomejs/cli-linux-x64@2.3.5", "", { "os": "linux", "cpu": "x64" }, "sha512-XrIVi9YAW6ye0CGQ+yax0gLfx+BFOtKaNX74n+xHWla6Cl6huUmcKNO7HPx7BiKnJUzrxXY1qYlm7xMvi08X4g=="],
"@biomejs/cli-linux-x64": ["@biomejs/cli-linux-x64@2.3.7", "", { "os": "linux", "cpu": "x64" }, "sha512-fJMc3ZEuo/NaMYo5rvoWjdSS5/uVSW+HPRQujucpZqm2ZCq71b8MKJ9U4th9yrv2L5+5NjPF0nqqILCl8HY/fg=="],
"@biomejs/cli-linux-x64-musl": ["@biomejs/cli-linux-x64-musl@2.3.5", "", { "os": "linux", "cpu": "x64" }, "sha512-awVuycTPpVTH/+WDVnEEYSf6nbCBHf/4wB3lquwT7puhNg8R4XvonWNZzUsfHZrCkjkLhFH/vCZK5jHatD9FEg=="],
"@biomejs/cli-linux-x64-musl": ["@biomejs/cli-linux-x64-musl@2.3.7", "", { "os": "linux", "cpu": "x64" }, "sha512-CQUtgH1tIN6e5wiYSJqzSwJumHYolNtaj1dwZGCnZXm2PZU1jOJof9TsyiP3bXNDb+VOR7oo7ZvY01If0W3iFQ=="],
"@biomejs/cli-win32-arm64": ["@biomejs/cli-win32-arm64@2.3.5", "", { "os": "win32", "cpu": "arm64" }, "sha512-DlBiMlBZZ9eIq4H7RimDSGsYcOtfOIfZOaI5CqsWiSlbTfqbPVfWtCf92wNzx8GNMbu1s7/g3ZZESr6+GwM/SA=="],
"@biomejs/cli-win32-arm64": ["@biomejs/cli-win32-arm64@2.3.7", "", { "os": "win32", "cpu": "arm64" }, "sha512-aJAE8eCNyRpcfx2JJAtsPtISnELJ0H4xVVSwnxm13bzI8RwbXMyVtxy2r5DV1xT3WiSP+7LxORcApWw0LM8HiA=="],
"@biomejs/cli-win32-x64": ["@biomejs/cli-win32-x64@2.3.5", "", { "os": "win32", "cpu": "x64" }, "sha512-nUmR8gb6yvrKhtRgzwo/gDimPwnO5a4sCydf8ZS2kHIJhEmSmk+STsusr1LHTuM//wXppBawvSQi2xFXJCdgKQ=="],
"@biomejs/cli-win32-x64": ["@biomejs/cli-win32-x64@2.3.7", "", { "os": "win32", "cpu": "x64" }, "sha512-pulzUshqv9Ed//MiE8MOUeeEkbkSHVDVY5Cz5wVAnH1DUqliCQG3j6s1POaITTFqFfo7AVIx2sWdKpx/GS+Nqw=="],
"@drizzle-team/brocli": ["@drizzle-team/brocli@0.10.2", "", {}, "sha512-z33Il7l5dKjUgGULTqBsQBQwckHh5AbIuxhdsIxDDiZAzBOrZO6q9ogcWC65kU382AfynTfgNumVcNIjuIua6w=="],

api/devspace.yaml

@@ -0,0 +1,23 @@
version: v2beta1
name: api
dev:
api:
imageSelector: ghcr.io/zoriya/kyoo_api
devImage: docker.io/oven/bun:latest
workingDir: /app
sync:
- path: .:/app
excludePaths:
- node_modules
startContainer: true
onUpload:
exec:
- command: bun install --frozen-lockfile
onChange:
- "./bun.lock"
command:
- bash
- -c
- "bun install && bun dev"
ports:
- port: "3567"


@@ -24,7 +24,7 @@
"sharp": "^0.34.4"
},
"devDependencies": {
"@biomejs/biome": "2.3.5",
"@biomejs/biome": "2.3.7",
"@types/pg": "^8.15.5"
},
"module": "src/index.js",


@@ -4,7 +4,7 @@ pkgs.mkShell {
bun
biome
# for psql to debug from the cli
- postgresql_15
+ postgresql_18
# to build libvips (for sharp)
nodejs
node-gyp
@@ -13,4 +13,7 @@ pkgs.mkShell {
];
SHARP_FORCE_GLOBAL_LIBVIPS = 1;
shellHook = ''
export LD_LIBRARY_PATH=${pkgs.stdenv.cc.cc.lib}/lib:$LD_LIBRARY_PATH
'';
}


@@ -52,8 +52,7 @@ export const base = new Elysia({ name: "base" })
console.error(code, error);
return {
status: 500,
message: "message" in error ? (error?.message ?? code) : code,
details: error,
message: "Internal server error",
} as KError;
})
.get("/health", () => ({ status: "healthy" }) as const, {


@@ -1,5 +1,5 @@
import path from "node:path";
- import { getCurrentSpan, record, setAttributes } from "@elysiajs/opentelemetry";
+ import { getCurrentSpan, setAttributes } from "@elysiajs/opentelemetry";
import { SpanStatusCode } from "@opentelemetry/api";
import { encode } from "blurhash";
import { and, eq, is, lt, type SQL, sql } from "drizzle-orm";
@@ -9,7 +9,9 @@ import type { PoolClient } from "pg";
import sharp from "sharp";
import { db, type Transaction } from "~/db";
import { mqueue } from "~/db/schema/mqueue";
import { unnestValues } from "~/db/utils";
import type { Image } from "~/models/utils";
import { record } from "~/otel";
import { getFile } from "~/utils";
export const imageDir = process.env.IMAGES_PATH ?? "/images";
@@ -28,8 +30,8 @@ export type ImageTask = {
export const enqueueOptImage = (
imgQueue: ImageTask[],
img:
- | { url: string | null; column: PgColumn }
- | { url: string | null; table: PgTable; column: SQL },
+ | { url?: string | null; column: PgColumn }
+ | { url?: string | null; table: PgTable; column: SQL },
): Image | null => {
if (!img.url) return null;
@@ -76,91 +78,87 @@ export const enqueueOptImage = (
};
};
export const flushImageQueue = async (
tx: Transaction,
imgQueue: ImageTask[],
priority: number,
) => {
if (!imgQueue.length) return;
record("enqueue images", async () => {
await tx
.insert(mqueue)
.values(imgQueue.map((x) => ({ kind: "image", message: x, priority })));
export const flushImageQueue = record(
"enqueueImages",
async (tx: Transaction, imgQueue: ImageTask[], priority: number) => {
if (!imgQueue.length) return;
await tx.insert(mqueue).select(
unnestValues(
imgQueue.map((x) => ({ kind: "image", message: x, priority })),
mqueue,
),
);
await tx.execute(sql`notify kyoo_image`);
});
};
},
);
export const processImages = async () => {
return record("download images", async () => {
let running = false;
async function processAll() {
if (running) return;
running = true;
export const processImages = record("processImages", async () => {
let running = false;
async function processAll() {
if (running) return;
running = true;
let found = true;
while (found) {
found = await processOne();
}
running = false;
let found = true;
while (found) {
found = await processOne();
}
running = false;
}
const client = (await db.$client.connect()) as PoolClient;
client.on("notification", (evt) => {
if (evt.channel !== "kyoo_image") return;
processAll();
});
await client.query("listen kyoo_image");
// start processing old tasks
await processAll();
return () => client.release(true);
const client = (await db.$client.connect()) as PoolClient;
client.on("notification", (evt) => {
if (evt.channel !== "kyoo_image") return;
processAll();
});
};
await client.query("listen kyoo_image");
async function processOne() {
return record("download", async () => {
return await db.transaction(async (tx) => {
const [item] = await tx
.select()
.from(mqueue)
.for("update", { skipLocked: true })
.where(and(eq(mqueue.kind, "image"), lt(mqueue.attempt, 5)))
.orderBy(mqueue.priority, mqueue.attempt, mqueue.createdAt)
.limit(1);
// start processing old tasks
await processAll();
return () => client.release(true);
});
if (!item) return false;
const processOne = record("download", async () => {
return await db.transaction(async (tx) => {
const [item] = await tx
.select()
.from(mqueue)
.for("update", { skipLocked: true })
.where(and(eq(mqueue.kind, "image"), lt(mqueue.attempt, 5)))
.orderBy(mqueue.priority, mqueue.attempt, mqueue.createdAt)
.limit(1);
const img = item.message as ImageTask;
setAttributes({ "item.url": img.url });
try {
const blurhash = await downloadImage(img.id, img.url);
const ret: Image = { id: img.id, source: img.url, blurhash };
if (!item) return false;
const table = sql.raw(img.table);
const column = sql.raw(img.column);
const img = item.message as ImageTask;
setAttributes({ "item.url": img.url });
try {
const blurhash = await downloadImage(img.id, img.url);
const ret: Image = { id: img.id, source: img.url, blurhash };
await tx.execute(sql`
update ${table} set ${column} = ${ret}
where ${column}->'id' = ${sql.raw(`'"${img.id}"'::jsonb`)}
`);
const table = sql.raw(img.table);
const column = sql.raw(img.column);
await tx.delete(mqueue).where(eq(mqueue.id, item.id));
} catch (err: any) {
const span = getCurrentSpan();
if (span) {
span.recordException(err);
span.setStatus({ code: SpanStatusCode.ERROR });
}
console.error("Failed to download image", img.url, err.message);
await tx
.update(mqueue)
.set({ attempt: sql`${mqueue.attempt}+1` })
.where(eq(mqueue.id, item.id));
await tx.execute(sql`
update ${table} set ${column} = ${ret}
where ${column}->'id' = ${sql.raw(`'"${img.id}"'::jsonb`)}
`);
await tx.delete(mqueue).where(eq(mqueue.id, item.id));
} catch (err: any) {
const span = getCurrentSpan();
if (span) {
span.recordException(err);
span.setStatus({ code: SpanStatusCode.ERROR });
}
return true;
});
console.error("Failed to download image", img.url, err.message);
await tx
.update(mqueue)
.set({ attempt: sql`${mqueue.attempt}+1` })
.where(eq(mqueue.id, item.id));
}
return true;
});
}
});
async function downloadImage(id: string, url: string): Promise<string> {
const low = await getFile(path.join(imageDir, `${id}.low.jpg`))
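
The image pipeline above is a plain-Postgres job queue: `flushImageQueue` inserts rows into `mqueue` and fires `notify kyoo_image`, while each `processImages` worker listens on that channel and claims one row at a time with `for update skip locked`, so concurrent workers never grab the same row — this is what lets several download queues run in parallel (per the commit at the top). A minimal sketch of the claim step with the `pg` driver; table and column names follow the diff but are assumed to be snake_case in the database:

```ts
import { Pool } from "pg";

const pool = new Pool(); // connection settings read from the usual PG* env vars

// Claim and handle a single queued image; returns false once the queue is
// empty. `for update skip locked` makes a concurrent worker skip rows that
// another transaction has already locked instead of blocking on them.
async function processOne(): Promise<boolean> {
  const client = await pool.connect();
  try {
    await client.query("begin");
    const { rows } = await client.query(
      `select id, message from mqueue
       where kind = 'image' and attempt < 5
       order by priority, attempt, created_at
       limit 1
       for update skip locked`,
    );
    if (rows.length === 0) {
      await client.query("commit");
      return false;
    }
    // ...download rows[0].message.url here, then delete the row on success
    // (or bump `attempt` on failure so the row is retried, five tries max):
    await client.query("delete from mqueue where id = $1", [rows[0].id]);
    await client.query("commit");
    return true;
  } catch (err) {
    await client.query("rollback");
    throw err;
  } finally {
    client.release();
  }
}
```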


@@ -5,81 +5,89 @@ import { conflictUpdateAllExcept } from "~/db/utils";
import type { SeedCollection } from "~/models/collections";
import type { SeedMovie } from "~/models/movie";
import type { SeedSerie } from "~/models/serie";
import { record } from "~/otel";
import { enqueueOptImage, flushImageQueue, type ImageTask } from "../images";
type ShowTrans = typeof showTranslations.$inferInsert;
export const insertCollection = async (
collection: SeedCollection | undefined,
show: (({ kind: "movie" } & SeedMovie) | ({ kind: "serie" } & SeedSerie)) & {
nextRefresh: string;
export const insertCollection = record(
"insertCollection",
async (
collection: SeedCollection | undefined,
show: (
| ({ kind: "movie" } & SeedMovie)
| ({ kind: "serie" } & SeedSerie)
) & {
nextRefresh: string;
},
) => {
if (!collection) return null;
const { translations, ...col } = collection;
return await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const [ret] = await tx
.insert(shows)
.values({
kind: "collection",
status: "unknown",
startAir: show.kind === "movie" ? show.airDate : show.startAir,
endAir: show.kind === "movie" ? show.airDate : show.endAir,
nextRefresh: show.nextRefresh,
entriesCount: 0,
original: {} as any,
...col,
})
.onConflictDoUpdate({
target: shows.slug,
set: {
...conflictUpdateAllExcept(shows, [
"pk",
"id",
"slug",
"createdAt",
"startAir",
"endAir",
]),
startAir: sql`least(${shows.startAir}, excluded.start_air)`,
endAir: sql`greatest(${shows.endAir}, excluded.end_air)`,
},
})
.returning({ pk: shows.pk, id: shows.id, slug: shows.slug });
const trans: ShowTrans[] = Object.entries(translations).map(
([lang, tr]) => ({
pk: ret.pk,
language: lang,
...tr,
poster: enqueueOptImage(imgQueue, {
url: tr.poster,
column: showTranslations.poster,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: tr.thumbnail,
column: showTranslations.thumbnail,
}),
logo: enqueueOptImage(imgQueue, {
url: tr.logo,
column: showTranslations.logo,
}),
banner: enqueueOptImage(imgQueue, {
url: tr.banner,
column: showTranslations.banner,
}),
}),
);
await flushImageQueue(tx, imgQueue, 100);
// we can't unnest values here because show translations contains arrays.
await tx
.insert(showTranslations)
.values(trans)
.onConflictDoUpdate({
target: [showTranslations.pk, showTranslations.language],
set: conflictUpdateAllExcept(showTranslations, ["pk", "language"]),
});
return ret;
});
},
) => {
if (!collection) return null;
const { translations, ...col } = collection;
return await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const [ret] = await tx
.insert(shows)
.values({
kind: "collection",
status: "unknown",
startAir: show.kind === "movie" ? show.airDate : show.startAir,
endAir: show.kind === "movie" ? show.airDate : show.endAir,
nextRefresh: show.nextRefresh,
entriesCount: 0,
original: {} as any,
...col,
})
.onConflictDoUpdate({
target: shows.slug,
set: {
...conflictUpdateAllExcept(shows, [
"pk",
"id",
"slug",
"createdAt",
"startAir",
"endAir",
]),
startAir: sql`least(${shows.startAir}, excluded.start_air)`,
endAir: sql`greatest(${shows.endAir}, excluded.end_air)`,
},
})
.returning({ pk: shows.pk, id: shows.id, slug: shows.slug });
const trans: ShowTrans[] = Object.entries(translations).map(
([lang, tr]) => ({
pk: ret.pk,
language: lang,
...tr,
poster: enqueueOptImage(imgQueue, {
url: tr.poster,
column: showTranslations.poster,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: tr.thumbnail,
column: showTranslations.thumbnail,
}),
logo: enqueueOptImage(imgQueue, {
url: tr.logo,
column: showTranslations.logo,
}),
banner: enqueueOptImage(imgQueue, {
url: tr.banner,
column: showTranslations.banner,
}),
}),
);
await flushImageQueue(tx, imgQueue, 100);
await tx
.insert(showTranslations)
.values(trans)
.onConflictDoUpdate({
target: [showTranslations.pk, showTranslations.language],
set: conflictUpdateAllExcept(showTranslations, ["pk", "language"]),
});
return ret;
});
};
);


@@ -6,8 +6,9 @@ import {
entryVideoJoin,
videos,
} from "~/db/schema";
import { conflictUpdateAllExcept, values } from "~/db/utils";
import { conflictUpdateAllExcept, unnest, unnestValues } from "~/db/utils";
import type { SeedEntry as SEntry, SeedExtra as SExtra } from "~/models/entry";
import { record } from "~/otel";
import { enqueueOptImage, flushImageQueue, type ImageTask } from "../images";
import { guessNextRefresh } from "../refresh";
import { updateAvailableCount, updateAvailableSince } from "./shows";
@@ -42,160 +43,164 @@ const generateSlug = (
}
};
export const insertEntries = async (
show: { pk: number; slug: string; kind: "movie" | "serie" | "collection" },
items: (SeedEntry | SeedExtra)[],
onlyExtras = false,
) => {
if (!items.length) return [];
export const insertEntries = record(
"insertEntries",
async (
show: { pk: number; slug: string; kind: "movie" | "serie" | "collection" },
items: (SeedEntry | SeedExtra)[],
onlyExtras = false,
) => {
if (!items.length) return [];
const retEntries = await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const vals: EntryI[] = items.map((seed) => {
const { translations, videos, video, ...entry } = seed;
return {
...entry,
showPk: show.pk,
slug: generateSlug(show.slug, seed),
thumbnail: enqueueOptImage(imgQueue, {
url: seed.thumbnail,
column: entries.thumbnail,
}),
nextRefresh:
entry.kind !== "extra"
? guessNextRefresh(entry.airDate ?? new Date())
: guessNextRefresh(new Date()),
episodeNumber:
entry.kind === "episode"
? entry.episodeNumber
: entry.kind === "special"
? entry.number
: undefined,
};
});
const ret = await tx
.insert(entries)
.values(vals)
.onConflictDoUpdate({
target: entries.slug,
set: conflictUpdateAllExcept(entries, [
"pk",
"showPk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: entries.pk, id: entries.id, slug: entries.slug });
const trans: EntryTransI[] = items.flatMap((seed, i) => {
if (seed.kind === "extra") {
return [
{
pk: ret[i].pk,
// yeah we hardcode the language to extra because if we want to support
// translations one day it won't be awkward
language: "extra",
name: seed.name,
description: null,
poster: undefined,
},
];
}
return Object.entries(seed.translations).map(([lang, tr]) => ({
// assumes ret is ordered like items.
pk: ret[i].pk,
language: lang,
...tr,
poster:
seed.kind === "movie"
? enqueueOptImage(imgQueue, {
url: (tr as any).poster,
column: entryTranslations.poster,
})
: undefined,
}));
});
await flushImageQueue(tx, imgQueue, 0);
await tx
.insert(entryTranslations)
.values(trans)
.onConflictDoUpdate({
target: [entryTranslations.pk, entryTranslations.language],
set: conflictUpdateAllExcept(entryTranslations, ["pk", "language"]),
const retEntries = await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const vals: EntryI[] = items.map((seed) => {
const { translations, videos, video, ...entry } = seed;
return {
...entry,
showPk: show.pk,
slug: generateSlug(show.slug, seed),
thumbnail: enqueueOptImage(imgQueue, {
url: seed.thumbnail,
column: entries.thumbnail,
}),
nextRefresh:
entry.kind !== "extra"
? guessNextRefresh(entry.airDate ?? new Date())
: guessNextRefresh(new Date()),
episodeNumber:
entry.kind === "episode"
? entry.episodeNumber
: entry.kind === "special"
? entry.number
: undefined,
};
});
const ret = await tx
.insert(entries)
.select(unnestValues(vals, entries))
.onConflictDoUpdate({
target: entries.slug,
set: conflictUpdateAllExcept(entries, [
"pk",
"showPk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: entries.pk, id: entries.id, slug: entries.slug });
return ret;
});
const trans: EntryTransI[] = items.flatMap((seed, i) => {
if (seed.kind === "extra") {
return [
{
pk: ret[i].pk,
// yeah we hardcode the language to extra because if we want to support
// translations one day it won't be awkward
language: "extra",
name: seed.name,
description: null,
poster: undefined,
},
];
}
const vids = items.flatMap((seed, i) => {
if (seed.kind === "extra") {
return {
videoId: seed.video,
return Object.entries(seed.translations).map(([lang, tr]) => ({
// assumes ret is ordered like items.
pk: ret[i].pk,
language: lang,
...tr,
poster:
seed.kind === "movie"
? enqueueOptImage(imgQueue, {
url: (tr as any).poster,
column: entryTranslations.poster,
})
: undefined,
}));
});
await flushImageQueue(tx, imgQueue, 0);
await tx
.insert(entryTranslations)
.select(unnestValues(trans, entryTranslations))
.onConflictDoUpdate({
target: [entryTranslations.pk, entryTranslations.language],
set: conflictUpdateAllExcept(entryTranslations, ["pk", "language"]),
});
return ret;
});
const vids = items.flatMap((seed, i) => {
if (seed.kind === "extra") {
return {
videoId: seed.video,
entryPk: retEntries[i].pk,
entrySlug: retEntries[i].slug,
needRendering: false,
};
}
if (!seed.videos) return [];
return seed.videos.map((x, j) => ({
videoId: x,
entryPk: retEntries[i].pk,
entrySlug: retEntries[i].slug,
needRendering: false,
};
// The first video should not have a rendering.
needRendering: j !== 0 && seed.videos!.length > 1,
}));
});
if (vids.length === 0) {
// we have not added videos but we need to update the `entriesCount`
if (show.kind === "serie" && !onlyExtras)
await updateAvailableCount(db, [show.pk], true);
return retEntries.map((x) => ({ id: x.id, slug: x.slug, videos: [] }));
}
if (!seed.videos) return [];
return seed.videos.map((x, j) => ({
videoId: x,
entryPk: retEntries[i].pk,
entrySlug: retEntries[i].slug,
// The first video should not have a rendering.
needRendering: j !== 0 && seed.videos!.length > 1,
const retVideos = await db.transaction(async (tx) => {
const ret = await tx
.insert(entryVideoJoin)
.select(
db
.select({
entryPk: sql<number>`vids."entryPk"`.as("entry"),
videoPk: videos.pk,
slug: computeVideoSlug(
sql`vids."entrySlug"`,
sql`vids."needRendering"`,
),
})
.from(
unnest(vids, "vids", {
entryPk: "integer",
entrySlug: "varchar(255)",
needRendering: "boolean",
videoId: "uuid",
}),
)
.innerJoin(videos, eq(videos.id, sql`vids."videoId"`)),
)
.onConflictDoNothing()
.returning({
slug: entryVideoJoin.slug,
entryPk: entryVideoJoin.entryPk,
});
if (!onlyExtras)
await updateAvailableCount(tx, [show.pk], show.kind === "serie");
await updateAvailableSince(tx, [...new Set(vids.map((x) => x.entryPk))]);
return ret;
});
return retEntries.map((entry) => ({
id: entry.id,
slug: entry.slug,
videos: retVideos.filter((x) => x.entryPk === entry.pk),
}));
});
if (vids.length === 0) {
// we have not added videos but we need to update the `entriesCount`
if (show.kind === "serie" && !onlyExtras)
await updateAvailableCount(db, [show.pk], true);
return retEntries.map((x) => ({ id: x.id, slug: x.slug, videos: [] }));
}
const retVideos = await db.transaction(async (tx) => {
const ret = await tx
.insert(entryVideoJoin)
.select(
db
.select({
entryPk: sql<number>`vids.entryPk`.as("entry"),
videoPk: videos.pk,
slug: computeVideoSlug(
sql`vids.entrySlug`,
sql`vids.needRendering`,
),
})
.from(
values(vids, {
entryPk: "integer",
needRendering: "boolean",
videoId: "uuid",
}).as("vids"),
)
.innerJoin(videos, eq(videos.id, sql`vids.videoId`)),
)
.onConflictDoNothing()
.returning({
slug: entryVideoJoin.slug,
entryPk: entryVideoJoin.entryPk,
});
if (!onlyExtras)
await updateAvailableCount(tx, [show.pk], show.kind === "serie");
await updateAvailableSince(tx, [...new Set(vids.map((x) => x.entryPk))]);
return ret;
});
return retEntries.map((entry) => ({
id: entry.id,
slug: entry.slug,
videos: retVideos.filter((x) => x.entryPk === entry.pk),
}));
};
},
);
export function computeVideoSlug(entrySlug: SQL | Column, needsRendering: SQL) {
return sql<string>`


@@ -1,77 +1,78 @@
import { db } from "~/db";
import { seasons, seasonTranslations } from "~/db/schema";
import { conflictUpdateAllExcept } from "~/db/utils";
import { conflictUpdateAllExcept, unnestValues } from "~/db/utils";
import type { SeedSeason } from "~/models/season";
import { record } from "~/otel";
import { enqueueOptImage, flushImageQueue, type ImageTask } from "../images";
import { guessNextRefresh } from "../refresh";
type SeasonI = typeof seasons.$inferInsert;
type SeasonTransI = typeof seasonTranslations.$inferInsert;
export const insertSeasons = async (
show: { pk: number; slug: string },
items: SeedSeason[],
) => {
if (!items.length) return [];
export const insertSeasons = record(
"insertSeasons",
async (show: { pk: number; slug: string }, items: SeedSeason[]) => {
if (!items.length) return [];
return db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const vals: SeasonI[] = items.map((x) => {
const { translations, ...season } = x;
return {
...season,
showPk: show.pk,
slug:
season.seasonNumber === 0
? `${show.slug}-specials`
: `${show.slug}-s${season.seasonNumber}`,
nextRefresh: guessNextRefresh(season.startAir ?? new Date()),
};
});
const ret = await tx
.insert(seasons)
.values(vals)
.onConflictDoUpdate({
target: seasons.slug,
set: conflictUpdateAllExcept(seasons, [
"pk",
"showPk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: seasons.pk, id: seasons.id, slug: seasons.slug });
const trans: SeasonTransI[] = items.flatMap((seed, i) =>
Object.entries(seed.translations).map(([lang, tr]) => ({
// assumes ret is ordered like items.
pk: ret[i].pk,
language: lang,
...tr,
poster: enqueueOptImage(imgQueue, {
url: tr.poster,
column: seasonTranslations.poster,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: tr.thumbnail,
column: seasonTranslations.thumbnail,
}),
banner: enqueueOptImage(imgQueue, {
url: tr.banner,
column: seasonTranslations.banner,
}),
})),
);
await flushImageQueue(tx, imgQueue, -10);
await tx
.insert(seasonTranslations)
.values(trans)
.onConflictDoUpdate({
target: [seasonTranslations.pk, seasonTranslations.language],
set: conflictUpdateAllExcept(seasonTranslations, ["pk", "language"]),
return db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const vals: SeasonI[] = items.map((x) => {
const { translations, ...season } = x;
return {
...season,
showPk: show.pk,
slug:
season.seasonNumber === 0
? `${show.slug}-specials`
: `${show.slug}-s${season.seasonNumber}`,
nextRefresh: guessNextRefresh(season.startAir ?? new Date()),
};
});
const ret = await tx
.insert(seasons)
.select(unnestValues(vals, seasons))
.onConflictDoUpdate({
target: seasons.slug,
set: conflictUpdateAllExcept(seasons, [
"pk",
"showPk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: seasons.pk, id: seasons.id, slug: seasons.slug });
return ret;
});
};
const trans: SeasonTransI[] = items.flatMap((seed, i) =>
Object.entries(seed.translations).map(([lang, tr]) => ({
// assumes ret is ordered like items.
pk: ret[i].pk,
language: lang,
...tr,
poster: enqueueOptImage(imgQueue, {
url: tr.poster,
column: seasonTranslations.poster,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: tr.thumbnail,
column: seasonTranslations.thumbnail,
}),
banner: enqueueOptImage(imgQueue, {
url: tr.banner,
column: seasonTranslations.banner,
}),
})),
);
await flushImageQueue(tx, imgQueue, -10);
await tx
.insert(seasonTranslations)
.select(unnestValues(trans, seasonTranslations))
.onConflictDoUpdate({
target: [seasonTranslations.pk, seasonTranslations.language],
set: conflictUpdateAllExcept(seasonTranslations, ["pk", "language"]),
});
return ret;
});
},
);


@@ -21,88 +21,93 @@ import type { SeedCollection } from "~/models/collections";
import type { SeedMovie } from "~/models/movie";
import type { SeedSerie } from "~/models/serie";
import type { Original } from "~/models/utils";
import { record } from "~/otel";
import { getYear } from "~/utils";
import { enqueueOptImage, flushImageQueue, type ImageTask } from "../images";
type Show = typeof shows.$inferInsert;
type ShowTrans = typeof showTranslations.$inferInsert;
export const insertShow = async (
show: Omit<Show, "original">,
original: Original & {
poster: string | null;
thumbnail: string | null;
banner: string | null;
logo: string | null;
},
translations:
| SeedMovie["translations"]
| SeedSerie["translations"]
| SeedCollection["translations"],
) => {
return await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const orig = {
...original,
poster: enqueueOptImage(imgQueue, {
url: original.poster,
table: shows,
column: sql`${shows.original}['poster']`,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: original.thumbnail,
table: shows,
column: sql`${shows.original}['thumbnail']`,
}),
banner: enqueueOptImage(imgQueue, {
url: original.banner,
table: shows,
column: sql`${shows.original}['banner']`,
}),
logo: enqueueOptImage(imgQueue, {
url: original.logo,
table: shows,
column: sql`${shows.original}['logo']`,
}),
};
const ret = await insertBaseShow(tx, { ...show, original: orig });
if ("status" in ret) return ret;
const trans: ShowTrans[] = Object.entries(translations).map(
([lang, tr]) => ({
pk: ret.pk,
language: lang,
...tr,
latinName: tr.latinName ?? null,
export const insertShow = record(
"insertShow",
async (
show: Omit<Show, "original">,
original: Original & {
poster?: string | null;
thumbnail?: string | null;
banner?: string | null;
logo?: string | null;
},
translations:
| SeedMovie["translations"]
| SeedSerie["translations"]
| SeedCollection["translations"],
) => {
return await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const orig = {
...original,
poster: enqueueOptImage(imgQueue, {
url: tr.poster,
column: showTranslations.poster,
url: original.poster,
table: shows,
column: sql`${shows.original}['poster']`,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: tr.thumbnail,
column: showTranslations.thumbnail,
}),
logo: enqueueOptImage(imgQueue, {
url: tr.logo,
column: showTranslations.logo,
url: original.thumbnail,
table: shows,
column: sql`${shows.original}['thumbnail']`,
}),
banner: enqueueOptImage(imgQueue, {
url: tr.banner,
column: showTranslations.banner,
url: original.banner,
table: shows,
column: sql`${shows.original}['banner']`,
}),
}),
);
await flushImageQueue(tx, imgQueue, 200);
await tx
.insert(showTranslations)
.values(trans)
.onConflictDoUpdate({
target: [showTranslations.pk, showTranslations.language],
set: conflictUpdateAllExcept(showTranslations, ["pk", "language"]),
});
return ret;
});
};
logo: enqueueOptImage(imgQueue, {
url: original.logo,
table: shows,
column: sql`${shows.original}['logo']`,
}),
};
const ret = await insertBaseShow(tx, { ...show, original: orig });
if ("status" in ret) return ret;
const trans: ShowTrans[] = Object.entries(translations).map(
([lang, tr]) => ({
pk: ret.pk,
language: lang,
...tr,
latinName: tr.latinName ?? null,
poster: enqueueOptImage(imgQueue, {
url: tr.poster,
column: showTranslations.poster,
}),
thumbnail: enqueueOptImage(imgQueue, {
url: tr.thumbnail,
column: showTranslations.thumbnail,
}),
logo: enqueueOptImage(imgQueue, {
url: tr.logo,
column: showTranslations.logo,
}),
banner: enqueueOptImage(imgQueue, {
url: tr.banner,
column: showTranslations.banner,
}),
}),
);
await flushImageQueue(tx, imgQueue, 200);
// we can't unnest values here because show translations contains arrays.
await tx
.insert(showTranslations)
.values(trans)
.onConflictDoUpdate({
target: [showTranslations.pk, showTranslations.language],
set: conflictUpdateAllExcept(showTranslations, ["pk", "language"]),
});
return ret;
});
},
);
async function insertBaseShow(tx: Transaction, show: Show) {
function insert() {


@@ -1,57 +1,67 @@
import { eq, sql } from "drizzle-orm";
import { db } from "~/db";
import { roles, staff } from "~/db/schema";
import { conflictUpdateAllExcept } from "~/db/utils";
import { conflictUpdateAllExcept, unnestValues } from "~/db/utils";
import type { SeedStaff } from "~/models/staff";
import { record } from "~/otel";
import { uniqBy } from "~/utils";
import { enqueueOptImage, flushImageQueue, type ImageTask } from "../images";
export const insertStaff = async (
seed: SeedStaff[] | undefined,
showPk: number,
) => {
if (!seed?.length) return [];
export const insertStaff = record(
"insertStaff",
async (seed: SeedStaff[] | undefined, showPk: number) => {
if (!seed?.length) return [];
return await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const people = seed.map((x) => ({
...x.staff,
image: enqueueOptImage(imgQueue, {
url: x.staff.image,
column: staff.image,
}),
}));
const ret = await tx
.insert(staff)
.values(people)
.onConflictDoUpdate({
target: staff.slug,
set: conflictUpdateAllExcept(staff, ["pk", "id", "slug", "createdAt"]),
})
.returning({ pk: staff.pk, id: staff.id, slug: staff.slug });
return await db.transaction(async (tx) => {
const imgQueue: ImageTask[] = [];
const people = uniqBy(
seed.map((x) => ({
...x.staff,
image: enqueueOptImage(imgQueue, {
url: x.staff.image,
column: staff.image,
}),
})),
(x) => x.slug,
);
const ret = await tx
.insert(staff)
.select(unnestValues(people, staff))
.onConflictDoUpdate({
target: staff.slug,
set: conflictUpdateAllExcept(staff, [
"pk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: staff.pk, id: staff.id, slug: staff.slug });
const rval = seed.map((x, i) => ({
showPk,
staffPk: ret[i].pk,
kind: x.kind,
order: i,
character: {
...x.character,
image: enqueueOptImage(imgQueue, {
url: x.character.image,
table: roles,
column: sql`${roles.character}['image']`,
}),
},
}));
const rval = seed.map((x, i) => ({
showPk,
staffPk: ret.find((y) => y.slug === x.staff.slug)!.pk,
kind: x.kind,
order: i,
character: {
...x.character,
image: enqueueOptImage(imgQueue, {
url: x.character.image,
table: roles,
column: sql`${roles.character}['image']`,
}),
},
}));
await flushImageQueue(tx, imgQueue, -200);
await flushImageQueue(tx, imgQueue, -200);
// always replace all roles. this is because:
// - we want `order` to stay in sync (& without duplicates)
// - we don't have ways to identify a role so we can't onConflict
await tx.delete(roles).where(eq(roles.showPk, showPk));
await tx.insert(roles).values(rval);
// always replace all roles. this is because:
// - we want `order` to stay in sync (& without duplicates)
// - we don't have ways to identify a role so we can't onConflict
await tx.delete(roles).where(eq(roles.showPk, showPk));
await tx.insert(roles).select(unnestValues(rval, roles));
return ret;
});
};
return ret;
});
},
);


@@ -1,63 +1,76 @@
import { sql } from "drizzle-orm";
import { db } from "~/db";
import { showStudioJoin, studios, studioTranslations } from "~/db/schema";
import { conflictUpdateAllExcept } from "~/db/utils";
import { conflictUpdateAllExcept, sqlarr, unnestValues } from "~/db/utils";
import type { SeedStudio } from "~/models/studio";
import { record } from "~/otel";
import { uniqBy } from "~/utils";
import { enqueueOptImage, flushImageQueue, type ImageTask } from "../images";
type StudioI = typeof studios.$inferInsert;
type StudioTransI = typeof studioTranslations.$inferInsert;
export const insertStudios = async (
seed: SeedStudio[] | undefined,
showPk: number,
) => {
if (!seed?.length) return [];
export const insertStudios = record(
"insertStudios",
async (seed: SeedStudio[] | undefined, showPk: number) => {
if (!seed?.length) return [];
return await db.transaction(async (tx) => {
const vals: StudioI[] = seed.map((x) => {
const { translations, ...item } = x;
return item;
});
const ret = await tx
.insert(studios)
.values(vals)
.onConflictDoUpdate({
target: studios.slug,
set: conflictUpdateAllExcept(studios, [
"pk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: studios.pk, id: studios.id, slug: studios.slug });
const imgQueue: ImageTask[] = [];
const trans: StudioTransI[] = seed.flatMap((x, i) =>
Object.entries(x.translations).map(([lang, tr]) => ({
pk: ret[i].pk,
language: lang,
name: tr.name,
logo: enqueueOptImage(imgQueue, {
url: tr.logo,
column: studioTranslations.logo,
}),
})),
);
await flushImageQueue(tx, imgQueue, -100);
await tx
.insert(studioTranslations)
.values(trans)
.onConflictDoUpdate({
target: [studioTranslations.pk, studioTranslations.language],
set: conflictUpdateAllExcept(studioTranslations, ["pk", "language"]),
return await db.transaction(async (tx) => {
seed = uniqBy(seed!, (x) => x.slug);
const vals: StudioI[] = seed.map((x) => {
const { translations, ...item } = x;
return item;
});
await tx
.insert(showStudioJoin)
.values(ret.map((studio) => ({ showPk: showPk, studioPk: studio.pk })))
.onConflictDoNothing();
return ret;
});
};
const ret = await tx
.insert(studios)
.select(unnestValues(vals, studios))
.onConflictDoUpdate({
target: studios.slug,
set: conflictUpdateAllExcept(studios, [
"pk",
"id",
"slug",
"createdAt",
]),
})
.returning({ pk: studios.pk, id: studios.id, slug: studios.slug });
const imgQueue: ImageTask[] = [];
const trans: StudioTransI[] = seed.flatMap((x, i) =>
Object.entries(x.translations).map(([lang, tr]) => ({
pk: ret[i].pk,
language: lang,
name: tr.name,
logo: enqueueOptImage(imgQueue, {
url: tr.logo,
column: studioTranslations.logo,
}),
})),
);
await flushImageQueue(tx, imgQueue, -100);
await tx
.insert(studioTranslations)
.select(unnestValues(trans, studioTranslations))
.onConflictDoUpdate({
target: [studioTranslations.pk, studioTranslations.language],
set: conflictUpdateAllExcept(studioTranslations, ["pk", "language"]),
});
await tx
.insert(showStudioJoin)
.select(
db
.select({
showPk: sql`${showPk}`.as("showPk"),
studioPk: sql`v."studioPk"`.as("studioPk"),
})
.from(
sql`unnest(${sqlarr(ret.map((x) => x.pk))}::integer[]) as v("studioPk")`,
),
)
.onConflictDoNothing();
return ret;
});
},
);


@@ -55,20 +55,13 @@ export const seedMovie = async (
const { translations, videos, collection, studios, staff, ...movie } = seed;
const nextRefresh = guessNextRefresh(movie.airDate ?? new Date());
- const original = translations[movie.originalLanguage];
- if (!original) {
- return {
- status: 422,
- message: "No translation available in the original language.",
- };
- }
const col = await insertCollection(collection, {
kind: "movie",
nextRefresh,
...seed,
});
+ const original = translations[movie.originalLanguage];
const show = await insertShow(
{
kind: "movie",
@@ -78,11 +71,17 @@ export const seedMovie = async (
entriesCount: 1,
...movie,
},
- {
- ...original,
- latinName: original.latinName ?? null,
- language: movie.originalLanguage,
- },
+ original
+ ? {
+ ...original,
+ latinName: original.latinName ?? null,
+ language: movie.originalLanguage,
+ }
+ : {
+ name: null,
+ latinName: null,
+ language: movie.originalLanguage,
+ },
translations,
);
if ("status" in show) return show;


@@ -91,20 +91,13 @@ export const seedSerie = async (
} = seed;
const nextRefresh = guessNextRefresh(serie.startAir ?? new Date());
- const original = translations[serie.originalLanguage];
- if (!original) {
- return {
- status: 422,
- message: "No translation available in the original language.",
- };
- }
const col = await insertCollection(collection, {
kind: "serie",
nextRefresh,
...seed,
});
+ const original = translations[serie.originalLanguage];
const show = await insertShow(
{
kind: "serie",
@@ -113,11 +106,17 @@ export const seedSerie = async (
entriesCount: entries.length,
...serie,
},
- {
- ...original,
- latinName: original.latinName ?? null,
- language: serie.originalLanguage,
- },
+ original
+ ? {
+ ...original,
+ latinName: original.latinName ?? null,
+ language: serie.originalLanguage,
+ }
+ : {
+ name: null,
+ latinName: null,
+ language: serie.originalLanguage,
+ },
translations,
);
if ("status" in show) return show;


@@ -35,7 +35,8 @@ import {
jsonbBuildObject,
jsonbObjectAgg,
sqlarr,
- values,
+ unnest,
+ unnestValues,
} from "~/db/utils";
import { Entry } from "~/models/entry";
import { KError } from "~/models/error";
@@ -129,10 +130,10 @@ async function linkVideos(
slug: computeVideoSlug(entriesQ.slug, hasRenderingQ),
})
.from(
- values(links, {
+ unnest(links, "j", {
video: "integer",
entry: "jsonb",
}).as("j"),
}),
)
.innerJoin(videos, eq(videos.pk, sql`j.video`))
.innerJoin(
@@ -830,12 +831,15 @@ export const videosWriteH = new Elysia({ prefix: "/videos", tags: ["videos"] })
.post(
"",
async ({ body, status }) => {
if (body.length === 0) {
return status(422, { status: 422, message: "No videos" });
}
return await db.transaction(async (tx) => {
let vids: { pk: number; id: string; path: string; guess: Guess }[] = [];
try {
vids = await tx
.insert(videos)
- .values(body)
+ .select(unnestValues(body, videos))
.onConflictDoUpdate({
target: [videos.path],
set: conflictUpdateAllExcept(videos, ["pk", "id", "createdAt"]),
@@ -924,6 +928,7 @@ export const videosWriteH = new Elysia({ prefix: "/videos", tags: ["videos"] })
description:
"Invalid rendering specified. (conflicts with an existing video)",
},
422: KError,
},
},
)


@@ -6,6 +6,7 @@ import { sql } from "drizzle-orm";
import { drizzle } from "drizzle-orm/node-postgres";
import { migrate as migrateDb } from "drizzle-orm/node-postgres/migrator";
import type { PoolConfig } from "pg";
import { record } from "~/otel";
import * as schema from "./schema";
const config: PoolConfig = {
@@ -22,8 +23,10 @@ const config: PoolConfig = {
async function parseSslConfig(): Promise<PoolConfig> {
// Due to an upstream bug, if `ssl` is not falsey, an SSL connection will always be attempted. This means
// that non-SSL connection options under `ssl` (which is incorrectly named) cannot be set unless SSL is enabled.
- if (!process.env.PGSSLMODE || process.env.PGSSLMODE === "disable")
+ if (!process.env.PGSSLMODE || process.env.PGSSLMODE === "disable") {
config.ssl = false;
return config;
}
// Despite this field's name, it is used to configure everything below the application layer.
const ssl: ConnectionOptions = {};
@@ -112,7 +115,19 @@ const postgresConfig = await parseSslConfig();
// use this when using drizzle-kit since it can't parse await statements
// const postgresConfig = config;
console.log("Connecting to postgres with config", postgresConfig);
console.log("Connecting to postgres with config", {
...postgresConfig,
password: postgresConfig.password ? "<redacted>" : undefined,
ssl:
postgresConfig.ssl && typeof postgresConfig.ssl === "object"
? {
...postgresConfig.ssl,
key: "<redacted>",
cert: "<redacted>",
ca: "<redacted>",
}
: postgresConfig.ssl,
});
export const db = drizzle({
schema,
connection: postgresConfig,
@@ -122,24 +137,26 @@ instrumentDrizzleClient(db, {
maxQueryTextLength: 100_000_000,
});
- export const migrate = async () => {
+ export const migrate = record("migrate", async () => {
const APP_SCHEMA = "kyoo";
try {
await db.execute(
sql.raw(`
- create extension if not exists pg_trgm;
- set pg_trgm.word_similarity_threshold = 0.4;
- alter database "${postgresConfig.database}" set pg_trgm.word_similarity_threshold = 0.4;
- `),
+ create schema if not exists ${APP_SCHEMA};
+ create extension if not exists pg_trgm schema ${APP_SCHEMA};
+ set pg_trgm.word_similarity_threshold = 0.4;
+ alter database "${postgresConfig.database}" set pg_trgm.word_similarity_threshold = 0.4;
+ `),
);
} catch (err: any) {
console.error("Error while updating pg_trgm", err.message);
}
await migrateDb(db, {
migrationsSchema: "kyoo",
migrationsSchema: APP_SCHEMA,
migrationsFolder: "./drizzle",
});
console.log(`Database ${postgresConfig.database} migrated!`);
- };
+ });
export type Transaction =
| typeof db


@@ -91,7 +91,7 @@ export const seasonRelations = relations(seasons, ({ one, many }) => ({
export const seasonTrRelations = relations(seasonTranslations, ({ one }) => ({
season: one(seasons, {
relationName: "season_translation",
relationName: "season_translations",
fields: [seasonTranslations.pk],
references: [seasons.pk],
}),


@@ -8,12 +8,17 @@ import {
type Subquery,
sql,
Table,
type TableConfig,
View,
ViewBaseConfig,
} from "drizzle-orm";
import type { CasingCache } from "drizzle-orm/casing";
import type { AnyMySqlSelect } from "drizzle-orm/mysql-core";
- import type { AnyPgSelect, SelectedFieldsFlat } from "drizzle-orm/pg-core";
+ import type {
+ AnyPgSelect,
+ PgTableWithColumns,
+ SelectedFieldsFlat,
+ } from "drizzle-orm/pg-core";
import type { AnySQLiteSelect } from "drizzle-orm/sqlite-core";
import type { WithSubquery } from "drizzle-orm/subquery";
import { db } from "./index";
@@ -69,8 +74,22 @@ export function conflictUpdateAllExcept<
}
// drizzle is bugged and doesn't allow js arrays to be used in raw sql.
- export function sqlarr(array: unknown[]) {
- return `{${array.map((item) => `"${item}"`).join(",")}}`;
+ export function sqlarr(array: unknown[]): string {
function escapeStr(str: string) {
return str.replaceAll("\\", "\\\\").replaceAll('"', '\\"');
}
return `{${array
.map((item) =>
item === "null" || item === null || item === undefined
? "null"
: Array.isArray(item)
? sqlarr(item)
: typeof item === "object"
? `"${escapeStr(JSON.stringify(item))}"`
: `"${escapeStr(item.toString())}"`,
)
.join(", ")}}`;
}
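// For reference, the literals the escaping above produces (each value is
// later cast with a `::type[]` suffix when embedded into a query):
//   sqlarr([1, 2, 3])           -> {"1", "2", "3"}
//   sqlarr(['Season"2', null])  -> {"Season\"2", null}
//   sqlarr([{ a: 1 }])          -> {"{\"a\":1}"}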
// See https://github.com/drizzle-team/drizzle-orm/issues/4044
@@ -103,6 +122,102 @@ export function values<K extends string>(
};
}
/* goal:
* unnestValues([{a: 1, b: 2}, {a: 3, b: 4}], tbl)
*
* ```sql
* select a, b, now() as updated_at from unnest($1::integer[], $2::integer[]);
* ```
* params:
* $1: [1, 3]
* $2: [2, 4]
*
* select
*/
export const unnestValues = <
T extends Record<string, unknown>,
C extends TableConfig = never,
>(
values: T[],
typeInfo: PgTableWithColumns<C>,
) => {
if (values[0] === undefined)
throw new Error("Invalid values, expecting at least one items");
const columns = getTableColumns(typeInfo);
const keys = Object.keys(values[0]).filter((x) => x in columns);
// @ts-expect-error: drizzle internal
const casing = db.dialect.casing as CasingCache;
const dbNames = Object.fromEntries(
Object.entries(columns).map(([k, v]) => [k, casing.getColumnCasing(v)]),
);
const vals = values.reduce(
(acc, cur, i) => {
for (const k of keys) {
if (k in cur) acc[k].push(cur[k]);
else acc[k].push(null);
}
for (const k of Object.keys(cur)) {
if (k in acc) continue;
if (!(k in columns)) continue;
keys.push(k);
acc[k] = new Array(i).fill(null);
acc[k].push(cur[k]);
}
return acc;
},
Object.fromEntries(keys.map((x) => [x, [] as unknown[]])),
);
const computed = Object.entries(columns)
.filter(([k, v]) => (v.defaultFn || v.onUpdateFn) && !keys.includes(k))
.map(([k]) => k);
return db
.select(
Object.fromEntries([
...keys.map((x) => [x, sql.raw(`"${dbNames[x]}"`)]),
...computed.map((x) => [
x,
(columns[x].defaultFn?.() ?? columns[x].onUpdateFn!()).as(dbNames[x]),
]),
]) as {
[k in keyof typeof typeInfo.$inferInsert]-?: SQL.Aliased<
(typeof typeInfo.$inferInsert)[k]
>;
},
)
.from(
sql`unnest(${sql.join(
keys.map(
(k) =>
sql`${sqlarr(vals[k])}${sql.raw(`::${columns[k].getSQLType()}[]`)}`,
),
sql.raw(", "),
)}) as v(${sql.raw(keys.map((x) => `"${dbNames[x]}"`).join(", "))})`,
);
};
export const unnest = <T extends Record<string, unknown>>(
values: T[],
name: string,
typeInfo: Record<keyof T, string>,
) => {
const keys = Object.keys(typeInfo);
const vals = values.reduce(
(acc, cur) => {
for (const k of keys) {
if (k in cur) acc[k].push(cur[k]);
else acc[k].push(null);
}
return acc;
},
Object.fromEntries(keys.map((x) => [x, [] as unknown[]])),
);
return sql`unnest(${sql.join(
keys.map((k) => sql`${sqlarr(vals[k])}${sql.raw(`::${typeInfo[k]}[]`)}`),
sql.raw(", "),
)}) as ${sql.raw(name)}(${sql.raw(keys.map((x) => `"${x}"`).join(", "))})`;
};
export const coalesce = <T>(val: SQL<T> | SQLWrapper, def: SQL<T> | Column) => {
return sql<T>`coalesce(${val}, ${def})`;
};
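
Taken with the goal comment above, `unnestValues` is why the insert paths switched from `.insert(t).values(rows)` to `.insert(t).select(unnestValues(rows, t))`: each column ships as a single array parameter that Postgres `unnest`s back into rows, so batch size no longer multiplies the number of bind parameters. A rough usage sketch against a hypothetical table (not part of this schema):

```ts
import { integer, pgTable, serial, text } from "drizzle-orm/pg-core";
import { db } from "~/db";
import { unnestValues } from "~/db/utils";

// Illustrative table only; the real callers use entries, seasons, staff, etc.
const users = pgTable("users", {
  pk: serial("pk").primaryKey(),
  name: text("name").notNull(),
  age: integer("age"),
});

// Behaves like `.values(rows)` but generates roughly:
//   insert into users ("name", "age")
//   select "name", "age"
//   from unnest($1::text[], $2::integer[]) as v("name", "age")
// with $1 = {"Ana", "Bob"} and $2 = {"32", "41"}.
const ret = await db
  .insert(users)
  .select(
    unnestValues(
      [
        { name: "Ana", age: 32 },
        { name: "Bob", age: 41 },
      ],
      users,
    ),
  )
  .returning({ pk: users.pk });
```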


@@ -8,7 +8,9 @@ import { comment } from "./utils";
await migrate();
// run image processor task in background
- processImages();
+ for (let i = 0; i < 10; i++) {
+ processImages();
+ }
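// (each processImages() call above opens its own pg connection, listens on
// the kyoo_image channel, and claims rows with `for update skip locked`, so
// the ten loops drain the image queue in parallel without double-processing)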
const app = new Elysia()
.use(


@@ -7,10 +7,12 @@ export const Original = t.Object({
description: "The language code this was made in.",
examples: ["ja"],
}),
- name: t.String({
- description: "The name in the original language",
- examples: ["進撃の巨人"],
- }),
+ name: t.Nullable(
+ t.String({
+ description: "The name in the original language",
+ examples: ["進撃の巨人"],
+ }),
+ ),
latinName: t.Nullable(
t.String({
description: comment`


@@ -1,4 +1,4 @@
import { opentelemetry } from "@elysiajs/opentelemetry";
import { record as elysiaRecord, opentelemetry } from "@elysiajs/opentelemetry";
import { OTLPMetricExporter as GrpcMetricExporter } from "@opentelemetry/exporter-metrics-otlp-grpc";
import { OTLPMetricExporter as HttpMetricExporter } from "@opentelemetry/exporter-metrics-otlp-proto";
import { OTLPTraceExporter as GrpcTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc";
@@ -32,3 +32,12 @@ export const otel = new Elysia()
}),
)
.as("global");
export function record<T extends (...args: any) => any>(
spanName: string,
fn: T,
): T {
const wrapped = (...args: Parameters<T>) =>
elysiaRecord(spanName, () => fn(...args));
return wrapped as T;
}
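// Usage, as in the insert helpers above:
//   export const insertSeasons = record("insertSeasons", async (show, items) => { ... });
// The wrapped function keeps its original signature; every call simply runs
// inside an otel span named after the first argument.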


@@ -28,3 +28,13 @@ export function getFile(path: string): BunFile | S3File {
return Bun.file(path);
}
export function uniqBy<T>(a: T[], key: (val: T) => string): T[] {
const seen: Record<string, boolean> = {};
return a.filter((item) => {
const k = key(item);
if (seen[k]) return false;
seen[k] = true;
return true;
});
}
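For example, deduplicating staff members by slug while keeping the first occurrence (values hypothetical):

const members = uniqBy(
	[{ slug: "riko" }, { slug: "reg" }, { slug: "riko" }],
	(x) => x.slug,
);
// => [{ slug: "riko" }, { slug: "reg" }]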

View File

@@ -17,6 +17,7 @@ describe("images", () => {
});
it("Create a serie download images", async () => {
await db.delete(mqueue);
await createSerie(madeInAbyss);
const release = await processImages();
// remove notifications to prevent other images from being downloaded (do not curl 20000 images for nothing)

@@ -31,12 +32,17 @@ describe("images", () => {
});
it("Download 404 image", async () => {
await db.delete(mqueue);
const url404 = "https://mockhttp.org/status/404";
const [ret, body] = await createMovie({
...dune,
translations: {
en: {
...dune.translations.en,
poster: "https://www.google.com/404",
poster: url404,
thumbnail: null,
banner: null,
logo: null,
},
},
});
@@ -49,7 +55,7 @@ describe("images", () => {
const failed = await db.query.mqueue.findFirst({
where: and(
eq(mqueue.kind, "image"),
eq(sql`${mqueue.message}->>'url'`, "https://www.google.com/404"),
eq(sql`${mqueue.message}->>'url'`, url404),
),
});
expect(failed!.attempt).toBe(5);

View File

@@ -6,9 +6,12 @@ import {
getStaffRoles,
} from "tests/helpers";
import { expectStatus } from "tests/utils";
import { db } from "~/db";
import { staff } from "~/db/schema";
import { madeInAbyss } from "~/models/examples";
beforeAll(async () => {
await db.delete(staff);
await createSerie(madeInAbyss);
});

View File

@@ -2,7 +2,7 @@ import { beforeAll, describe, expect, it } from "bun:test";
import { eq } from "drizzle-orm";
import { expectStatus } from "tests/utils";
import { db } from "~/db";
import { seasons, shows, videos } from "~/db/schema";
import { entries, seasons, shows, videos } from "~/db/schema";
import { madeInAbyss, madeInAbyssVideo } from "~/models/examples";
import { createSerie } from "../helpers";
@@ -104,4 +104,61 @@ describe("Serie seeding", () => {
],
});
});
it("Can create a serie with quotes", async () => {
await db.delete(entries);
const [resp, body] = await createSerie({
...madeInAbyss,
slug: "quote-test",
seasons: [
{
...madeInAbyss.seasons[0],
translations: {
en: {
...madeInAbyss.seasons[0].translations.en,
name: "Season'1",
},
},
},
{
...madeInAbyss.seasons[1],
translations: {
en: {
...madeInAbyss.seasons[0].translations.en,
name: 'Season"2',
description: `This's """""quote, idk'''''`,
},
},
},
],
});
expectStatus(resp, body).toBe(201);
expect(body.id).toBeString();
expect(body.slug).toBe("quote-test");
const ret = await db.query.shows.findFirst({
where: eq(shows.id, body.id),
with: {
seasons: {
orderBy: seasons.seasonNumber,
with: { translations: true },
},
entries: {
with: {
translations: true,
evj: { with: { video: true } },
},
},
},
});
expect(ret).not.toBeNull();
expect(ret!.seasons).toBeArrayOfSize(2);
expect(ret!.seasons[0].translations[0].name).toBe("Season'1");
expect(ret!.seasons[1].translations[0].name).toBe('Season"2');
expect(ret!.entries).toBeArrayOfSize(
madeInAbyss.entries.length + madeInAbyss.extras.length,
);
});
});
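The test above pushes single quotes, double quotes, and runs of both through the seeding path, which serializes values as Postgres array literals (via the sqlarr helper, per the earlier diff). Inside an array literal, each element is double-quoted with embedded backslashes and double quotes backslash-escaped, while single quotes need no escaping since the whole literal is sent as a bound parameter. A hedged sketch of that quoting rule (not necessarily sqlarr's exact implementation):

const arrayLiteral = (vals: (string | null)[]) =>
	`{${vals
		.map((v) =>
			v === null
				? "NULL"
				: `"${v.replace(/\\/g, "\\\\").replace(/"/g, '\\"')}"`,
		)
		.join(",")}}`;

arrayLiteral(["Season'1", 'Season"2']); // => {"Season'1","Season\"2"}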

View File

@@ -3,6 +3,7 @@ import { beforeAll } from "bun:test";
process.env.PGDATABASE = "kyoo_test";
process.env.JWT_SECRET = "this is a secret";
process.env.JWT_ISSUER = "https://kyoo.zoriya.dev";
process.env.IMAGES_PATH = "./images";
beforeAll(async () => {
// lazy load this so env set before actually applies

View File

@@ -1,6 +1,6 @@
{
"compilerOptions": {
"target": "ES2021",
"target": "ES2022",
"module": "ES2022",
"moduleResolution": "node",
"esModuleInterop": true,

View File

@@ -1,11 +1,11 @@
FROM golang:1.25 AS build
FROM --platform=$BUILDPLATFORM golang:1.25 AS build
WORKDIR /app
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN CGO_ENABLED=0 GOOS=linux go build -o /keibi
RUN CGO_ENABLED=0 GOOS=${TARGETOS:-linux} GOARCH=$TARGETARCH go build -o /keibi
FROM gcr.io/distroless/base-debian11
WORKDIR /app

View File

@@ -211,6 +211,7 @@ func (h *Handler) createApiJwt(apikey string) (string, error) {
Time: time.Now().UTC().Add(time.Hour),
}
jwt := jwt.NewWithClaims(jwt.SigningMethodRS256, claims)
jwt.Header["kid"] = h.config.JwtKid
t, err := jwt.SignedString(h.config.JwtPrivateKey)
if err != nil {
return "", err
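The Go change above adds the kid header so JWKS consumers can select the right public key when verifying. For comparison, the same idea in TypeScript with the jose library (key and kid values hypothetical):

import { SignJWT } from "jose";

declare const privateKey: CryptoKey; // hypothetical RS256 private key

const token = await new SignJWT({ sub: "some-user-id" })
	.setProtectedHeader({ alg: "RS256", kid: "my-key-id" })
	.setIssuedAt()
	.setExpirationTime("1h")
	.sign(privateKey);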

auth/devspace.yaml (new file, 18 lines)
View File

@@ -0,0 +1,18 @@
version: v2beta1
name: auth
dev:
auth:
imageSelector: ghcr.io/zoriya/kyoo_auth
devImage: docker.io/golang:1.25
workingDir: /app
sync:
- path: .:/app
startContainer: true
onUpload:
restartContainer: true
command:
- bash
- -c
- "go mod download; go run -race ."
ports:
- port: "4568"

View File

@@ -10,7 +10,7 @@ require (
github.com/golang-jwt/jwt/v5 v5.3.0
github.com/google/uuid v1.6.0
github.com/jackc/pgx/v5 v5.7.6
github.com/labstack/echo-jwt/v4 v4.3.1
github.com/labstack/echo-jwt/v4 v4.4.0
github.com/labstack/echo/v4 v4.13.4
github.com/lestrrat-go/jwx/v3 v3.0.12
github.com/swaggo/echo-swagger v1.4.1
@@ -86,7 +86,7 @@ require (
golang.org/x/sync v0.18.0 // indirect
golang.org/x/sys v0.38.0 // indirect
golang.org/x/text v0.31.0 // indirect
golang.org/x/time v0.12.0 // indirect
golang.org/x/time v0.14.0 // indirect
golang.org/x/tools v0.38.0 // indirect
gopkg.in/yaml.v2 v2.4.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect

View File

@@ -93,8 +93,8 @@ github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/labstack/echo-jwt/v4 v4.3.1 h1:d8+/qf8nx7RxeL46LtoIwHJsH2PNN8xXCQ/jDianycE=
github.com/labstack/echo-jwt/v4 v4.3.1/go.mod h1:yJi83kN8S/5vePVPd+7ID75P4PqPNVRs2HVeuvYJH00=
github.com/labstack/echo-jwt/v4 v4.4.0 h1:nrXaEnJupfc2R4XChcLRDyghhMZup77F8nIzHnBK19U=
github.com/labstack/echo-jwt/v4 v4.4.0/go.mod h1:kYXWgWms9iFqI3ldR+HAEj/Zfg5rZtR7ePOgktG4Hjg=
github.com/labstack/echo/v4 v4.13.4 h1:oTZZW+T3s9gAu5L8vmzihV7/lkXGZuITzTQkTEhcXEA=
github.com/labstack/echo/v4 v4.13.4/go.mod h1:g63b33BZ5vZzcIUF8AtRH40DrTlXnx4UMC8rBdndmjQ=
github.com/labstack/gommon v0.4.2 h1:F8qTUNXgG1+6WQmqoUWnz8WiEU60mXVVw0P4ht1WRA0=
@@ -250,8 +250,8 @@ golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.12.0 h1:ScB/8o8olJvc+CQPWrK3fPZNfh7qgwCrY0zJmoEQLSE=
golang.org/x/time v0.12.0/go.mod h1:CDIdPxbZBQxdj6cxyCIdrNogrJKMJ7pr37NYpMcMDSg=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=

View File

@@ -88,7 +88,9 @@ func setupOtel(e *echo.Echo) (func(), error) {
otel.SetTracerProvider(tp)
e.Use(otelecho.Middleware("kyoo.auth", otelecho.WithSkipper(func(c echo.Context) bool {
return c.Path() == "/auth/health" || c.Path() == "/auth/ready"
return (c.Path() == "/auth/health" ||
c.Path() == "/auth/ready" ||
strings.HasPrefix(c.Path(), "/.well-known/"))
})))
return func() {

View File

@@ -7,7 +7,7 @@ pkgs.mkShell {
sqlc
go-swag
# for psql in cli (+ pgformatter for sql files)
postgresql_15
postgresql_18
pgformatter
# to run tests
hurl

chart/.gitignore (new vendored file, 2 lines)
View File

@@ -0,0 +1,2 @@
charts

chart/Chart.lock (new file, 6 lines)
View File

@@ -0,0 +1,6 @@
dependencies:
- name: postgres
repository: oci://registry-1.docker.io/cloudpirates
version: 0.12.4
digest: sha256:e486b44703c7a97eee25f7715ab040d197d79c41ea1c422ae009b1f68985f544
generated: "2025-12-01T20:17:25.152279487+01:00"

View File

@@ -12,4 +12,4 @@ dependencies:
- condition: postgres.enabled
name: postgres
repository: oci://registry-1.docker.io/cloudpirates
version: 0.11.6
version: 0.12.4

View File

@@ -469,7 +469,7 @@ postgres:
existingSecret: "{{ .Values.global.postgres.infra.existingSecret }}"
secretKeys:
# set the postgres user password to the same as our user
passwordKey: "{{ .Values.global.postgres.infra.passwordKey }}"
adminPasswordKey: "{{ .Values.global.postgres.infra.passwordKey }}"
initdb:
scripts:
kyoo_api.sql: |

devspace.yaml (new file, 18 lines)
View File

@@ -0,0 +1,18 @@
version: v2beta1
name: kyoo-devspace
dependencies:
api:
path: ./api
pipeline: dev
auth:
path: ./auth
pipeline: dev
front:
path: ./front
pipeline: dev
scanner:
path: ./scanner
pipeline: dev
transcoder:
path: ./transcoder
pipeline: dev

View File

@@ -88,7 +88,7 @@ services:
env_file:
- ./.env
volumes:
- images:/app/images
- images:/images
labels:
- "traefik.enable=true"
- "traefik.http.routers.swagger.rule=PathPrefix(`/swagger`)"
@@ -177,7 +177,7 @@ services:
profiles: ['qsv']
traefik:
image: traefik:v3.5
image: traefik:v3.6
restart: on-failure
command:
- "--providers.docker=true"
@@ -190,12 +190,12 @@ services:
- "/var/run/docker.sock:/var/run/docker.sock:ro"
postgres:
image: postgres:15
image: postgres:18
restart: on-failure
env_file:
- ./.env
volumes:
- db:/var/lib/postgresql/data
- db:/var/lib/postgresql
ports:
- "5432:5432"
environment:

View File

@@ -58,7 +58,7 @@ services:
env_file:
- ./.env
volumes:
- images:/app/images
- images:/images
labels:
- "traefik.enable=true"
- "traefik.http.routers.swagger.rule=PathPrefix(`/swagger`)"
@@ -126,7 +126,7 @@ services:
profiles: ["qsv"]
traefik:
image: traefik:v3.5
image: traefik:v3.6
restart: unless-stopped
command:
- "--providers.docker=true"
@@ -139,12 +139,12 @@ services:
- "/var/run/docker.sock:/var/run/docker.sock:ro"
postgres:
image: postgres:15
image: postgres:18
restart: unless-stopped
env_file:
- ./.env
volumes:
- db:/var/lib/postgresql/data
- db:/var/lib/postgresql
environment:
- POSTGRES_USER=$PGUSER
- POSTGRES_PASSWORD=$PGPASSWORD

View File

@@ -1,4 +1,4 @@
FROM oven/bun AS builder
FROM --platform=$BUILDPLATFORM oven/bun AS builder
WORKDIR /app
# https://github.com/oven-sh/bun/issues/24538

front/devspace.yaml (new file, 29 lines)
View File

@@ -0,0 +1,29 @@
version: v2beta1
name: front
dev:
front:
imageSelector: ghcr.io/zoriya/kyoo_front
devImage: docker.io/oven/bun:latest
workingDir: /app
sync:
- path: .:/app
excludePaths:
- node_modules
startContainer: true
onUpload:
exec:
- command: bun install --frozen-lockfile
onChange:
- "./bun.lock"
# increase sysctl limits for file watching
# front uses the Metro JavaScript bundler, which watches a lot of files
# these are node-level settings that should be raised
# example values:
# fs.inotify.max_user_instances = 8192
# fs.inotify.max_user_watches = 1048576
command:
- bash
- -c
- "bun install --frozen-lockfile; bun dev --port 8901"
ports:
- port: "8901"

View File

@@ -10,7 +10,7 @@ export const Collection = z
slug: z.string(),
name: z.string(),
original: z.object({
name: z.string(),
name: z.string().nullable(),
latinName: z.string().nullable(),
language: z.string(),
}),

View File

@@ -11,7 +11,7 @@ export const Movie = z
slug: z.string(),
name: z.string(),
original: z.object({
name: z.string(),
name: z.string().nullable(),
latinName: z.string().nullable(),
language: z.string(),
}),

View File

@@ -12,7 +12,7 @@ export const Serie = z
slug: z.string(),
name: z.string(),
original: z.object({
name: z.string(),
name: z.string().nullable(),
latinName: z.string().nullable(),
language: z.string(),
}),

View File

@@ -28,10 +28,12 @@ export const ImageBackground = ({
const { css, theme } = useYoshiki();
const { apiUrl, authToken } = useToken();
const uri = src ? `${apiUrl}${src[quality ?? "high"]}` : null;
return (
<EImageBackground
recyclingKey={uri}
source={{
uri: src ? `${apiUrl}${src[quality ?? "high"]}` : null,
uri,
// use cookies on web to allow `img` to make the call instead of js
headers:
authToken && Platform.OS !== "web"

View File

@@ -37,10 +37,12 @@ export const Image = ({
const { css, theme } = useYoshiki();
const { apiUrl, authToken } = useToken();
const uri = src ? `${apiUrl}${src[quality ?? "high"]}` : null;
return (
<EImage
recyclingKey={uri}
source={{
uri: src ? `${apiUrl}${src[quality ?? "high"]}` : null,
uri,
// use cookies on web to allow `img` to make the call instead of js
headers:
authToken && Platform.OS !== "web"
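Both image diffs above hoist the computed uri and pass it as recyclingKey. In virtualized lists the underlying native image view gets recycled across items, so without a key a reused cell can briefly flash the previous item's poster; keying by the uri resets the view whenever it changes. A minimal sketch with expo-image (component and style values hypothetical):

import { Image } from "expo-image";

export const Poster = ({ uri }: { uri: string | null }) => (
	<Image
		recyclingKey={uri}
		source={{ uri: uri ?? undefined }}
		style={{ width: 120, height: 180 }}
	/>
);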

View File

@@ -84,6 +84,7 @@ export const login = async (
export const logout = async () => {
const accounts = readAccounts();
const account = accounts.find((x) => x.selected);
removeAccounts((x) => x.selected);
if (account) {
await queryFn({
method: "DELETE",
@@ -92,7 +93,6 @@ export const logout = async () => {
parser: null,
});
}
removeAccounts((x) => x.selected);
};
export const deleteAccount = async () => {

scanner/devspace.yaml (new file, 25 lines)
View File

@@ -0,0 +1,25 @@
version: v2beta1
name: scanner
dev:
scanner:
imageSelector: ghcr.io/zoriya/kyoo_scanner
devImage: docker.io/astral/uv:python3.13-trixie
workingDir: /app
sync:
- path: .:/app
excludePaths:
- __pycache__
- .venv
startContainer: true
onUpload:
restartContainer: true
command:
- bash
- -c
- |
echo "Running uv sync..."
uv sync --locked || (echo 'uv sync failed' && exit 1)
echo "Starting FastAPI..."
/app/.venv/bin/fastapi run scanner --port 4389
ports:
- port: "4389"

View File

@@ -18,7 +18,8 @@ create table scanner.requests(
external_id jsonb not null default '{}'::jsonb,
videos jsonb not null default '[]'::jsonb,
status scanner.request_status not null default 'pending',
error jsonb,
started_at timestamptz,
created_at timestamptz not null default now()::timestamptz,
constraint unique_kty unique(kind, title, year)
constraint unique_kty unique nulls not distinct (kind, title, year)
);
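For context on the nulls not distinct change above: by default NULLs never compare equal, so a plain unique(kind, title, year) happily accepts two requests for the same title whose year is NULL, and duplicate scan requests could pile up. A minimal demonstration, written as raw SQL strings in TypeScript to keep the examples in one language (table name hypothetical):

const setup = `create temp table req(
	kind text, title text, year int,
	unique nulls not distinct (kind, title, year))`;

const first = "insert into req values ('movie', 'Dune', null)"; // ok
const second = "insert into req values ('movie', 'Dune', null)"; // duplicate key error
// without `nulls not distinct`, the second insert would also succeed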

View File

@@ -5,13 +5,15 @@ from fastapi import FastAPI
from scanner.client import KyooClient
from scanner.fsscan import FsScanner
from scanner.otel import instrument
from scanner.log import configure_logging
from scanner.otel import setup_otelproviders, instrument
from scanner.providers.composite import CompositeProvider
from scanner.providers.themoviedatabase import TheMovieDatabase
from scanner.requests import RequestCreator, RequestProcessor
from .database import get_db, init_pool, migrate
from .routers.routes import router
from .routers.health import router as health_router
@asynccontextmanager
@@ -24,6 +26,12 @@ async def lifespan(_):
):
# there's no way someone else used the same id, right?
is_master = await db.fetchval("select pg_try_advisory_lock(198347)")
is_http = not is_master and await db.fetchval(
"select pg_try_advisory_lock(645633)"
)
if is_http:
yield
return
if is_master:
await migrate()
processor = RequestProcessor(pool, client, tmdb)
@@ -68,4 +76,7 @@ app = FastAPI(
lifespan=lifespan,
)
app.include_router(router)
app.include_router(health_router)
configure_logging()
setup_otelproviders()
instrument(app)
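The lifespan above elects roles with Postgres advisory locks: one replica wins the master lock (and runs migrations plus the request processor), at most one other wins the http lock (and only serves the API), and the remaining replicas presumably stay idle. A session-level advisory lock is held until the connection closes, so the role lasts for the process lifetime. A sketch in TypeScript against a hypothetical query function:

declare function query<T>(text: string, params?: unknown[]): Promise<T[]>; // hypothetical client

const MASTER_LOCK = 198347; // lock ids copied from the diff above
const HTTP_LOCK = 645633;

const tryLock = async (id: number) => {
	const rows = await query<{ locked: boolean }>(
		"select pg_try_advisory_lock($1) as locked",
		[id],
	);
	return rows[0].locked;
};

const isMaster = await tryLock(MASTER_LOCK); // migrations + request processor
const isHttpOnly = !isMaster && (await tryLock(HTTP_LOCK)); // API only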

View File

@@ -3,7 +3,7 @@ from logging import getLogger
from types import TracebackType
from typing import Literal
from aiohttp import ClientSession
from aiohttp import ClientResponse, ClientResponseError, ClientSession
from pydantic import TypeAdapter
from .models.movie import Movie
@@ -38,9 +38,19 @@ class KyooClient(metaclass=Singleton):
):
await self._client.close()
async def raise_for_status(self, r: ClientResponse):
if r.status >= 400:
raise ClientResponseError(
r.request_info,
r.history,
status=r.status,
message=await r.text(),
headers=r.headers,
)
async def get_videos_info(self) -> VideoInfo:
async with self._client.get("videos") as r:
r.raise_for_status()
await self.raise_for_status(r)
return VideoInfo(**await r.json())
async def create_videos(self, videos: list[Video]) -> list[VideoCreated]:
@@ -48,7 +58,7 @@ class KyooClient(metaclass=Singleton):
"videos",
data=TypeAdapter(list[Video]).dump_json(videos, by_alias=True),
) as r:
r.raise_for_status()
await self.raise_for_status(r)
return TypeAdapter(list[VideoCreated]).validate_json(await r.text())
async def delete_videos(self, videos: list[str] | set[str]):
@@ -56,14 +66,14 @@ class KyooClient(metaclass=Singleton):
"videos",
data=TypeAdapter(list[str] | set[str]).dump_json(videos, by_alias=True),
) as r:
r.raise_for_status()
await self.raise_for_status(r)
async def create_movie(self, movie: Movie) -> Resource:
async with self._client.post(
"movies",
data=movie.model_dump_json(by_alias=True),
) as r:
r.raise_for_status()
await self.raise_for_status(r)
return Resource.model_validate(await r.json())
async def create_serie(self, serie: Serie) -> Resource:
@@ -71,7 +81,7 @@ class KyooClient(metaclass=Singleton):
"series",
data=serie.model_dump_json(by_alias=True),
) as r:
r.raise_for_status()
await self.raise_for_status(r)
return Resource.model_validate(await r.json())
async def link_videos(
@@ -100,4 +110,4 @@ class KyooClient(metaclass=Singleton):
by_alias=True,
),
) as r:
r.raise_for_status()
await self.raise_for_status(r)
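The raise_for_status helper above exists because aiohttp's built-in check drops the response body, which is exactly what the scanner wants to surface in request errors. The same idea with fetch in TypeScript:

async function raiseForStatus(r: Response): Promise<Response> {
	if (r.status >= 400) {
		// include the body text, which the default error path discards
		throw new Error(`${r.status} ${r.statusText}: ${await r.text()}`);
	}
	return r;
}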

scanner/scanner/log.py (new file, 32 lines)
View File

@@ -0,0 +1,32 @@
import logging
import os
import sys
from opentelemetry.sdk._logs import LoggingHandler
def configure_logging():
root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)
logging.getLogger("watchfiles").setLevel(logging.WARNING)
logging.getLogger("rebulk").setLevel(logging.WARNING)
# Add stdout handler
stdout_handler = logging.StreamHandler(sys.stdout)
# set logging level via STDOUT_LOG_LEVEL env var or default to INFO
stdout_handler.setLevel(
getattr(logging, os.getenv("STDOUT_LOG_LEVEL", "INFO").upper())
)
stdout_handler.setFormatter(
logging.Formatter(
fmt="[{levelname}][{name}] {message}",
style="{",
)
)
root_logger.addHandler(stdout_handler)
# Add OpenTelemetry handler
# set logging level via OTEL_LOG_LEVEL env var
# https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#general-sdk-configuration
root_logger.addHandler(LoggingHandler())

View File

@@ -1,5 +1,6 @@
from __future__ import annotations
from typing import Literal
from datetime import datetime
from typing import Any, Literal
from pydantic import Field
@@ -18,3 +19,17 @@ class Request(Model, extra="allow"):
class Video(Model):
id: str
episodes: list[Guess.Episode]
class RequestRet(Model):
id: str
kind: Literal["episode", "movie"]
title: str
year: int | None
status: Literal[
"pending",
"running",
"failed",
]
error: dict[str, Any] | None
started_at: datetime | None

View File

@@ -1,81 +1,77 @@
import logging
import os
import sys
from fastapi import FastAPI
from opentelemetry import metrics, trace
from opentelemetry._logs import set_logger_provider
from opentelemetry.exporter.otlp.proto.grpc._log_exporter import (
OTLPLogExporter as GrpcLogExporter,
)
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import (
OTLPMetricExporter as GrpcMetricExporter,
)
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
OTLPSpanExporter as GrpcSpanExporter,
)
from opentelemetry.exporter.otlp.proto.http._log_exporter import (
OTLPLogExporter as HttpLogExporter,
)
from opentelemetry.exporter.otlp.proto.http.metric_exporter import (
OTLPMetricExporter as HttpMetricExporter,
)
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
OTLPSpanExporter as HttpSpanExporter,
)
from opentelemetry import trace, metrics, _logs
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.sdk.resources import Resource
from opentelemetry.instrumentation.aiohttp_client import AioHttpClientInstrumentor
from opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.resources import SERVICE_NAME, Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
logger = logging.getLogger(__name__)
def setup_otelproviders() -> tuple[object, object, object]:
import os
if not (os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT", "").strip()):
logger.info(
"OTEL_EXPORTER_OTLP_ENDPOINT not specified, skipping otel provider setup."
)
return None, None, None
# choose exporters (grpc vs http) ...
if os.getenv("OTEL_EXPORTER_OTLP_PROTOCOL", "").lower().strip() == "grpc":
from opentelemetry.exporter.otlp.proto.grpc._log_exporter import OTLPLogExporter
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import (
OTLPMetricExporter,
)
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
OTLPSpanExporter,
)
logger.info("Using gRPC libs for OpenTelemetry exporter.")
else:
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.exporter.otlp.proto.http.metric_exporter import (
OTLPMetricExporter,
)
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
OTLPSpanExporter,
)
logger.info("Using HTTP libs for OpenTelemetry exporter.")
resource = Resource.create(
{"service.name": os.getenv("OTEL_SERVICE_NAME", "kyoo.scanner")}
)
# Traces
tracer_provider = TracerProvider(resource=resource)
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(tracer_provider)
# Metrics
meter_provider = MeterProvider(
resource=resource,
metric_readers=[PeriodicExportingMetricReader(OTLPMetricExporter())],
)
metrics.set_meter_provider(meter_provider)
# Logs — install logger provider + processor/exporter
logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(BatchLogRecordProcessor(OTLPLogExporter()))
_logs.set_logger_provider(logger_provider)
return tracer_provider, meter_provider, logger_provider
def instrument(app: FastAPI):
proto = os.getenv("OTEL_EXPORTER_OTLP_PROTOCOL", "http/protobuf")
resource = Resource.create(attributes={SERVICE_NAME: "kyoo.scanner"})
provider = LoggerProvider(resource=resource)
provider.add_log_record_processor(
BatchLogRecordProcessor(
HttpLogExporter() if proto == "http/protobuf" else GrpcLogExporter()
)
)
set_logger_provider(provider)
logging.basicConfig(
handlers=[
LoggingHandler(level=logging.DEBUG, logger_provider=provider),
logging.StreamHandler(sys.stdout),
],
level=logging.DEBUG,
)
logging.getLogger("watchfiles").setLevel(logging.WARNING)
logging.getLogger("rebulk").setLevel(logging.WARNING)
provider = TracerProvider(resource=resource)
provider.add_span_processor(
BatchSpanProcessor(
HttpSpanExporter() if proto == "http/protobuf" else GrpcSpanExporter()
)
)
trace.set_tracer_provider(provider)
provider = MeterProvider(
metric_readers=[
PeriodicExportingMetricReader(
HttpMetricExporter()
if proto == "http/protobuf"
else GrpcMetricExporter()
)
],
resource=resource,
)
metrics.set_meter_provider(provider)
FastAPIInstrumentor.instrument_app(
app,
http_capture_headers_server_request=[".*"],

View File

@@ -47,7 +47,7 @@ class Provider(ABC):
search = await self.search_movies(title, year, language=[])
if not any(search):
raise ProviderError(
f"Couldn't find a movie with title {title}. (year: {year}"
f"Couldn't find a movie with title {title}. (year: {year})"
)
ret = await self.get_movie(
{k: v.data_id for k, v in search[0].external_id.items()}
@@ -68,7 +68,7 @@ class Provider(ABC):
search = await self.search_series(title, year, language=[])
if not any(search):
raise ProviderError(
f"Couldn't find a serie with title {title}. (year: {year}"
f"Couldn't find a serie with title {title}. (year: {year})"
)
ret = await self.get_serie(
{k: v.data_id for k, v in search[0].external_id.items()}

View File

@@ -420,6 +420,8 @@ class TheMovieDatabase(Provider):
(x["episode_number"] for x in season["episodes"]), None
),
"entries_count": len(season["episodes"]),
# there can be gaps in episodes (like 1,2,5,6,7)
"episodes": [x["episode_number"] for x in season["episodes"]],
},
)
@@ -429,9 +431,9 @@ class TheMovieDatabase(Provider):
# TODO: batch those
ret = await asyncio.gather(
*[
self._get_entry(serie_id, s.season_number, s.extra["first_entry"] + e)
self._get_entry(serie_id, s.season_number, e)
for s in seasons
for e in range(0, s.extra["entries_count"])
for e in s.extra["episodes"]
]
)
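The TMDB change above stops reconstructing episode numbers from a count and instead iterates the exact numbers the provider reports, since seasons can have holes. A small sketch of the difference (numbers hypothetical):

const episodes = [1, 2, 5, 6, 7]; // a season with gaps

// before: first_entry + index over the count -> fetches 1..5, misses 6 and 7
const before = Array.from({ length: episodes.length }, (_, e) => 1 + e);

// after: use the real episode numbers
const after = episodes;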

View File

@@ -1,5 +1,6 @@
from asyncio import CancelledError, Event, TaskGroup
from logging import getLogger
from traceback import TracebackException
from typing import cast
from asyncpg import Connection, Pool
@@ -40,6 +41,8 @@ class RequestCreator:
"""
delete from scanner.requests
where status = 'failed'
or (status = 'running'
and now() - started_at > interval '1 hour')
"""
)
@@ -161,11 +164,22 @@ class RequestProcessor:
update
scanner.requests
set
status = 'failed'
status = 'failed',
error = $2
where
pk = $1
""",
request.pk,
{
"title": type(e).__name__,
"message": str(e),
"traceback": [
line
for part in TracebackException.from_exception(e).format()
for line in part.split("\n")
if line.strip()
],
},
)
return True
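The failure path above now persists a structured error (exception name, message, and individual traceback lines) instead of only flipping the status, so the new status API can return it. A comparable serializer in TypeScript:

function serializeError(e: Error) {
	return {
		title: e.constructor.name,
		message: e.message,
		traceback: (e.stack ?? "")
			.split("\n")
			.map((line) => line.trim())
			.filter((line) => line.length > 0),
	};
}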

View File

@@ -0,0 +1,15 @@
from fastapi import APIRouter
router = APIRouter()
@router.get("/health")
def get_health():
return {"status": "healthy"}
@router.get("/ready")
def get_ready():
# child spans (`select 1` & db connection reset) were still logged,
# since I don't really wanna deal with it, let's just do that.
return {"status": "healthy"}

View File

@@ -1,9 +1,9 @@
from typing import Annotated
from typing import Annotated, Literal
from asyncpg import Connection
from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException, Security
from fastapi import APIRouter, BackgroundTasks, Depends, Security
from scanner.database import get_db_fapi
from scanner.models.request import RequestRet
from scanner.status import StatusService
from ..fsscan import create_scanner
from ..jwt import validate_bearer
@@ -11,6 +11,19 @@ from ..jwt import validate_bearer
router = APIRouter()
@router.get("/scan")
async def get_scan_status(
svc: Annotated[StatusService, Depends(StatusService.create)],
_: Annotated[None, Security(validate_bearer, scopes=["scanner.trigger"])],
status: Literal["pending", "running", "failed"] | None = None,
) -> list[RequestRet]:
"""
Get scan status, know what tasks are running, pending or failed.
"""
return await svc.list_requests(status=status)
@router.put(
"/scan",
status_code=204,
@@ -29,25 +42,3 @@ async def trigger_scan(
await scanner.scan()
tasks.add_task(run)
@router.get("/health")
def get_health():
return {"status": "healthy"}
@router.get("/ready")
def get_ready():
# child spans (`select 1` & db connection reset) were still logged,
# since I don't really wanna deal with it, let's just do that.
return {"status": "healthy"}
# async def get_ready(db: Annotated[Connection, Depends(get_db_fapi)]):
# try:
# _ = await db.execute("select 1")
# return {"status": "healthy", "database": "healthy"}
# except Exception as e:
# raise HTTPException(
# status_code=500, detail={"status": "unhealthy", "database": str(e)}
# )

scanner/scanner/status.py (new file, 42 lines)
View File

@@ -0,0 +1,42 @@
from typing import Literal
from asyncpg import Connection
from pydantic import TypeAdapter
from scanner.database import get_db
from .models.request import RequestRet
class StatusService:
def __init__(self, database: Connection):
self._database = database
@classmethod
async def create(cls):
async with get_db() as db:
yield StatusService(db)
async def list_requests(
self, *, status: Literal["pending", "running", "failed"] | None = None
) -> list[RequestRet]:
ret = await self._database.fetch(
f"""
select
pk::text as id,
kind,
title,
year,
status,
error,
started_at
from
scanner.requests
{"where status = $1" if status is not None else ""}
order by
started_at,
pk
""",
*([status] if status is not None else []),
)
return TypeAdapter(list[RequestRet]).validate_python(ret)

View File

@@ -10,8 +10,15 @@ pkgs.mkShell {
(import ./transcoder/shell.nix {inherit pkgs;})
];
packages = [
pkgs.devspace
];
# env vars aren't inherited from the `inputsFrom`
SHARP_FORCE_GLOBAL_LIBVIPS = 1;
shellHook = ''
export LD_LIBRARY_PATH=${pkgs.stdenv.cc.cc.lib}/lib:$LD_LIBRARY_PATH
'';
UV_PYTHON_PREFERENCE = "only-system";
UV_PYTHON = pkgs.python313;
}

transcoder/devspace.yaml (new file, 41 lines)
View File

@@ -0,0 +1,41 @@
version: v2beta1
name: transcoder
dev:
transcoder:
imageSelector: ghcr.io/zoriya/kyoo_transcoder
# would be good to publish the kyoo_transcoder builder image with all deps installed
devImage: golang:1.25
workingDir: /app
sync:
- path: .:/app
startContainer: true
onUpload:
restartContainer: true
command:
- bash
- -c
- |
echo "Determining architecture..."
ARCH=$(dpkg --print-architecture)
echo "Container architecture: $ARCH"
echo "Updating apt and installing transcoder dependencies..."
apt-get update && \
apt-get install --no-install-recommends --no-install-suggests -y \
ffmpeg \
libavformat-dev libavutil-dev libswscale-dev \
vainfo mesa-va-drivers \
ca-certificates
if [ "$ARCH" = "amd64" ]; then
echo "Installing Intel VAAPI drivers (amd64 only)..."
apt-get install -y intel-media-va-driver-non-free i965-va-driver-shaders
else
echo "Skipping Intel VAAPI drivers for arch $ARCH"
fi
echo "Downloading Go dependencies..."
go mod download
echo "Starting transcoder..."
go run -race .
ports:
- port: "7666"

View File

@@ -4,13 +4,13 @@ go 1.24.2
require (
github.com/asticode/go-astisub v0.38.0
github.com/aws/aws-sdk-go-v2 v1.39.6
github.com/aws/aws-sdk-go-v2/service/s3 v1.90.2
github.com/aws/aws-sdk-go-v2 v1.40.0
github.com/aws/aws-sdk-go-v2/service/s3 v1.92.1
github.com/disintegration/imaging v1.6.2
github.com/exaring/otelpgx v0.9.3
github.com/golang-migrate/migrate/v4 v4.19.0
github.com/jackc/pgx/v5 v5.7.6
github.com/labstack/echo-jwt/v4 v4.3.1
github.com/labstack/echo-jwt/v4 v4.4.0
github.com/labstack/echo/v4 v4.13.4
github.com/swaggo/echo-swagger v1.4.1
github.com/swaggo/swag v1.16.6
@@ -36,6 +36,7 @@ require (
github.com/KyleBanks/depth v1.2.1 // indirect
github.com/asticode/go-astikit v0.57.1 // indirect
github.com/asticode/go-astits v1.14.0 // indirect
github.com/aws/aws-sdk-go-v2/service/signin v1.0.2 // indirect
github.com/cenkalti/backoff/v5 v5.0.3 // indirect
github.com/ghodss/yaml v1.0.0 // indirect
github.com/go-logr/logr v1.4.3 // indirect
@@ -73,20 +74,20 @@ require (
require (
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.7.3 // indirect
github.com/aws/aws-sdk-go-v2/config v1.31.20
github.com/aws/aws-sdk-go-v2/credentials v1.18.24 // indirect
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.13 // indirect
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.13 // indirect
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.13 // indirect
github.com/aws/aws-sdk-go-v2/config v1.32.2
github.com/aws/aws-sdk-go-v2/credentials v1.19.2 // indirect
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.14 // indirect
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.14 // indirect
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.14 // indirect
github.com/aws/aws-sdk-go-v2/internal/ini v1.8.4 // indirect
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.13 // indirect
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.14 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.3 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.4 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.13 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.13 // indirect
github.com/aws/aws-sdk-go-v2/service/sso v1.30.3 // indirect
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.7 // indirect
github.com/aws/aws-sdk-go-v2/service/sts v1.40.2 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.5 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.14 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.14 // indirect
github.com/aws/aws-sdk-go-v2/service/sso v1.30.5 // indirect
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.10 // indirect
github.com/aws/aws-sdk-go-v2/service/sts v1.41.2 // indirect
github.com/aws/smithy-go v1.23.2 // indirect
github.com/decred/dcrd/dcrec/secp256k1/v4 v4.4.0 // indirect
github.com/goccy/go-json v0.10.5 // indirect

View File

@@ -13,40 +13,42 @@ github.com/asticode/go-astisub v0.38.0/go.mod h1:WTkuSzFB+Bp7wezuSf2Oxulj5A8zu2z
github.com/asticode/go-astits v1.8.0/go.mod h1:DkOWmBNQpnr9mv24KfZjq4JawCFX1FCqjLVGvO0DygQ=
github.com/asticode/go-astits v1.14.0 h1:zkgnZzipx2XX5mWycqsSBeEyDH58+i4HtyF4j2ROb00=
github.com/asticode/go-astits v1.14.0/go.mod h1:QSHmknZ51pf6KJdHKZHJTLlMegIrhega3LPWz3ND/iI=
github.com/aws/aws-sdk-go-v2 v1.39.6 h1:2JrPCVgWJm7bm83BDwY5z8ietmeJUbh3O2ACnn+Xsqk=
github.com/aws/aws-sdk-go-v2 v1.39.6/go.mod h1:c9pm7VwuW0UPxAEYGyTmyurVcNrbF6Rt/wixFqDhcjE=
github.com/aws/aws-sdk-go-v2 v1.40.0 h1:/WMUA0kjhZExjOQN2z3oLALDREea1A7TobfuiBrKlwc=
github.com/aws/aws-sdk-go-v2 v1.40.0/go.mod h1:c9pm7VwuW0UPxAEYGyTmyurVcNrbF6Rt/wixFqDhcjE=
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.7.3 h1:DHctwEM8P8iTXFxC/QK0MRjwEpWQeM9yzidCRjldUz0=
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.7.3/go.mod h1:xdCzcZEtnSTKVDOmUZs4l/j3pSV6rpo1WXl5ugNsL8Y=
github.com/aws/aws-sdk-go-v2/config v1.31.20 h1:/jWF4Wu90EhKCgjTdy1DGxcbcbNrjfBHvksEL79tfQc=
github.com/aws/aws-sdk-go-v2/config v1.31.20/go.mod h1:95Hh1Tc5VYKL9NJ7tAkDcqeKt+MCXQB1hQZaRdJIZE0=
github.com/aws/aws-sdk-go-v2/credentials v1.18.24 h1:iJ2FmPT35EaIB0+kMa6TnQ+PwG5A1prEdAw+PsMzfHg=
github.com/aws/aws-sdk-go-v2/credentials v1.18.24/go.mod h1:U91+DrfjAiXPDEGYhh/x29o4p0qHX5HDqG7y5VViv64=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.13 h1:T1brd5dR3/fzNFAQch/iBKeX07/ffu/cLu+q+RuzEWk=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.13/go.mod h1:Peg/GBAQ6JDt+RoBf4meB1wylmAipb7Kg2ZFakZTlwk=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.13 h1:a+8/MLcWlIxo1lF9xaGt3J/u3yOZx+CdSveSNwjhD40=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.13/go.mod h1:oGnKwIYZ4XttyU2JWxFrwvhF6YKiK/9/wmE3v3Iu9K8=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.13 h1:HBSI2kDkMdWz4ZM7FjwE7e/pWDEZ+nR95x8Ztet1ooY=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.13/go.mod h1:YE94ZoDArI7awZqJzBAZ3PDD2zSfuP7w6P2knOzIn8M=
github.com/aws/aws-sdk-go-v2/config v1.32.2 h1:4liUsdEpUUPZs5WVapsJLx5NPmQhQdez7nYFcovrytk=
github.com/aws/aws-sdk-go-v2/config v1.32.2/go.mod h1:l0hs06IFz1eCT+jTacU/qZtC33nvcnLADAPL/XyrkZI=
github.com/aws/aws-sdk-go-v2/credentials v1.19.2 h1:qZry8VUyTK4VIo5aEdUcBjPZHL2v4FyQ3QEOaWcFLu4=
github.com/aws/aws-sdk-go-v2/credentials v1.19.2/go.mod h1:YUqm5a1/kBnoK+/NY5WEiMocZihKSo15/tJdmdXnM5g=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.14 h1:WZVR5DbDgxzA0BJeudId89Kmgy6DIU4ORpxwsVHz0qA=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.14/go.mod h1:Dadl9QO0kHgbrH1GRqGiZdYtW5w+IXXaBNCHTIaheM4=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.14 h1:PZHqQACxYb8mYgms4RZbhZG0a7dPW06xOjmaH0EJC/I=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.14/go.mod h1:VymhrMJUWs69D8u0/lZ7jSB6WgaG/NqHi3gX0aYf6U0=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.14 h1:bOS19y6zlJwagBfHxs0ESzr1XCOU2KXJCWcq3E2vfjY=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.14/go.mod h1:1ipeGBMAxZ0xcTm6y6paC2C/J6f6OO7LBODV9afuAyM=
github.com/aws/aws-sdk-go-v2/internal/ini v1.8.4 h1:WKuaxf++XKWlHWu9ECbMlha8WOEGm0OUEZqm4K/Gcfk=
github.com/aws/aws-sdk-go-v2/internal/ini v1.8.4/go.mod h1:ZWy7j6v1vWGmPReu0iSGvRiise4YI5SkR3OHKTZ6Wuc=
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.13 h1:eg/WYAa12vqTphzIdWMzqYRVKKnCboVPRlvaybNCqPA=
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.13/go.mod h1:/FDdxWhz1486obGrKKC1HONd7krpk38LBt+dutLcN9k=
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.14 h1:ITi7qiDSv/mSGDSWNpZ4k4Ve0DQR6Ug2SJQ8zEHoDXg=
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.14/go.mod h1:k1xtME53H1b6YpZt74YmwlONMWf4ecM+lut1WQLAF/U=
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.3 h1:x2Ibm/Af8Fi+BH+Hsn9TXGdT+hKbDd5XOTZxTMxDk7o=
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.3/go.mod h1:IW1jwyrQgMdhisceG8fQLmQIydcT/jWY21rFhzgaKwo=
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.4 h1:NvMjwvv8hpGUILarKw7Z4Q0w1H9anXKsesMxtw++MA4=
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.4/go.mod h1:455WPHSwaGj2waRSpQp7TsnpOnBfw8iDfPfbwl7KPJE=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.13 h1:kDqdFvMY4AtKoACfzIGD8A0+hbT41KTKF//gq7jITfM=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.13/go.mod h1:lmKuogqSU3HzQCwZ9ZtcqOc5XGMqtDK7OIc2+DxiUEg=
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.13 h1:zhBJXdhWIFZ1acfDYIhu4+LCzdUS2Vbcum7D01dXlHQ=
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.13/go.mod h1:JaaOeCE368qn2Hzi3sEzY6FgAZVCIYcC2nwbro2QCh8=
github.com/aws/aws-sdk-go-v2/service/s3 v1.90.2 h1:DhdbtDl4FdNlj31+xiRXANxEE+eC7n8JQz+/ilwQ8Uc=
github.com/aws/aws-sdk-go-v2/service/s3 v1.90.2/go.mod h1:+wArOOrcHUevqdto9k1tKOF5++YTe9JEcPSc9Tx2ZSw=
github.com/aws/aws-sdk-go-v2/service/sso v1.30.3 h1:NjShtS1t8r5LUfFVtFeI8xLAHQNTa7UI0VawXlrBMFQ=
github.com/aws/aws-sdk-go-v2/service/sso v1.30.3/go.mod h1:fKvyjJcz63iL/ftA6RaM8sRCtN4r4zl4tjL3qw5ec7k=
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.7 h1:gTsnx0xXNQ6SBbymoDvcoRHL+q4l/dAFsQuKfDWSaGc=
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.7/go.mod h1:klO+ejMvYsB4QATfEOIXk8WAEwN4N0aBfJpvC+5SZBo=
github.com/aws/aws-sdk-go-v2/service/sts v1.40.2 h1:HK5ON3KmQV2HcAunnx4sKLB9aPf3gKGwVAf7xnx0QT0=
github.com/aws/aws-sdk-go-v2/service/sts v1.40.2/go.mod h1:E19xDjpzPZC7LS2knI9E6BaRFDK43Eul7vd6rSq2HWk=
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.5 h1:Hjkh7kE6D81PgrHlE/m9gx+4TyyeLHuY8xJs7yXN5C4=
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.5/go.mod h1:nPRXgyCfAurhyaTMoBMwRBYBhaHI4lNPAnJmjM0Tslc=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.14 h1:FIouAnCE46kyYqyhs0XEBDFFSREtdnr8HQuLPQPLCrY=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.14/go.mod h1:UTwDc5COa5+guonQU8qBikJo1ZJ4ln2r1MkF7Dqag1E=
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.14 h1:FzQE21lNtUor0Fb7QNgnEyiRCBlolLTX/Z1j65S7teM=
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.14/go.mod h1:s1ydyWG9pm3ZwmmYN21HKyG9WzAZhYVW85wMHs5FV6w=
github.com/aws/aws-sdk-go-v2/service/s3 v1.92.1 h1:OgQy/+0+Kc3khtqiEOk23xQAglXi3Tj0y5doOxbi5tg=
github.com/aws/aws-sdk-go-v2/service/s3 v1.92.1/go.mod h1:wYNqY3L02Z3IgRYxOBPH9I1zD9Cjh9hI5QOy/eOjQvw=
github.com/aws/aws-sdk-go-v2/service/signin v1.0.2 h1:MxMBdKTYBjPQChlJhi4qlEueqB1p1KcbTEa7tD5aqPs=
github.com/aws/aws-sdk-go-v2/service/signin v1.0.2/go.mod h1:iS6EPmNeqCsGo+xQmXv0jIMjyYtQfnwg36zl2FwEouk=
github.com/aws/aws-sdk-go-v2/service/sso v1.30.5 h1:ksUT5KtgpZd3SAiFJNJ0AFEJVva3gjBmN7eXUZjzUwQ=
github.com/aws/aws-sdk-go-v2/service/sso v1.30.5/go.mod h1:av+ArJpoYf3pgyrj6tcehSFW+y9/QvAY8kMooR9bZCw=
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.10 h1:GtsxyiF3Nd3JahRBJbxLCCdYW9ltGQYrFWg8XdkGDd8=
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.10/go.mod h1:/j67Z5XBVDx8nZVp9EuFM9/BS5dvBznbqILGuu73hug=
github.com/aws/aws-sdk-go-v2/service/sts v1.41.2 h1:a5UTtD4mHBU3t0o6aHQZFJTNKVfxFWfPX7J0Lr7G+uY=
github.com/aws/aws-sdk-go-v2/service/sts v1.41.2/go.mod h1:6TxbXoDSgBQ225Qd8Q+MbxUxUh6TtNKwbRt/EPS9xso=
github.com/aws/smithy-go v1.23.2 h1:Crv0eatJUQhaManss33hS5r40CG3ZFH+21XSkqMrIUM=
github.com/aws/smithy-go v1.23.2/go.mod h1:LEj2LM3rBRQJxPZTB4KuzZkaZYnZPnvgIhb4pu07mx0=
github.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=
@@ -145,8 +147,8 @@ github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/labstack/echo-jwt/v4 v4.3.1 h1:d8+/qf8nx7RxeL46LtoIwHJsH2PNN8xXCQ/jDianycE=
github.com/labstack/echo-jwt/v4 v4.3.1/go.mod h1:yJi83kN8S/5vePVPd+7ID75P4PqPNVRs2HVeuvYJH00=
github.com/labstack/echo-jwt/v4 v4.4.0 h1:nrXaEnJupfc2R4XChcLRDyghhMZup77F8nIzHnBK19U=
github.com/labstack/echo-jwt/v4 v4.4.0/go.mod h1:kYXWgWms9iFqI3ldR+HAEj/Zfg5rZtR7ePOgktG4Hjg=
github.com/labstack/echo/v4 v4.13.4 h1:oTZZW+T3s9gAu5L8vmzihV7/lkXGZuITzTQkTEhcXEA=
github.com/labstack/echo/v4 v4.13.4/go.mod h1:g63b33BZ5vZzcIUF8AtRH40DrTlXnx4UMC8rBdndmjQ=
github.com/labstack/gommon v0.4.2 h1:F8qTUNXgG1+6WQmqoUWnz8WiEU60mXVVw0P4ht1WRA0=

View File

@@ -6,7 +6,7 @@ pkgs.mkShell {
go-migrate
go-swag
# for psql in cli (+ pgformatter for sql files)
postgresql_15
postgresql_18
pgformatter
# to debug video files
mediainfo