Compare commits

..

53 Commits

Author SHA1 Message Date
github-merge-queue
83a864607c flake.lock: Update 2026-03-09 00:38:49 +00:00
Janne Heß
a9581bcdc4 Merge pull request #1564 from d-goldin/fix/github-diff-url
fix: Github diffs URL
2026-01-30 19:11:11 +00:00
Dima
26e4d5eb54 fix: Github diffs URL
In https://github.com/NixOS/hydra/pull/1549, diffs were
offloaded to GitHub for performance reasons.

While GitHub accepts a `.git` suffix in the repository name on
some endpoints, the comparison endpoint does not seem to
accept it.

Specifically, on the main NixOS org Hydra this isn't working:

Example job: https://hydra.nixos.org/build/320178054

Generates a comparison link like so:
078d69f039...1cd347bf33

This just strips away the suffix and seems to work fine in local
testing.
2026-01-27 01:45:51 +01:00
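The fix described above amounts to trimming an optional `.git` suffix before assembling GitHub's compare URL. A minimal sketch of that logic (Python with illustrative names; Hydra's actual change lives in its Template Toolkit templates):

```python
import re

def github_compare_url(repo_url: str, rev1: str, rev2: str) -> str:
    # GitHub tolerates a trailing ".git" on clone URLs but not on the
    # /compare/ endpoint, so strip the suffix before concatenating.
    base = re.sub(r"\.git$", "", repo_url)
    return f"{base}/compare/{rev1}...{rev2}"
```

With the suffix stripped, a link for the example above would point at `.../compare/078d69f039...1cd347bf33` on the bare repository URL.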
John Ericson
8bc95a96f7 Merge pull request #1559 from NixOS/bump-nix
bump to nix v2.33
2026-01-23 23:56:48 +00:00
Amaan Qureshi
82cd5e0e23 Fix build after Nix bump 2026-01-23 18:49:40 -05:00
Jörg Thalheim
c3ed183c64 bump to nix v2.33 2026-01-23 18:49:35 -05:00
John Ericson
b45f0d1fa7 Merge pull request #1556 from Mindavi/bugfix/perlcritic-fixes
treewide: update split calls to make perlcritic happy
2026-01-23 23:22:23 +00:00
Rick van Schijndel
e4fe9d43c1 treewide: update split calls to make perlcritic happy
In nixpkgs this started to fail the Hydra tests.
It's not completely clear why, since the perlcritic rule
seems to have existed for quite some time.

Anyway, this should solve the issues.
2026-01-17 15:55:29 +01:00
Janne Heß
9df4b65c67 Merge pull request #1558 from NixOS/schema-changes
meson: add missing schema migration
2026-01-14 13:11:42 +00:00
Janne Heß
1d011baed8 Merge pull request #1557 from NixOS/update-flakes
Update flake inputs
2026-01-14 09:19:24 +00:00
github-merge-queue
52b2e4f021 flake.lock: Update 2026-01-14 09:53:02 +01:00
Jörg Thalheim
f089ff87f5 build: automatically include all sql files
To prevent issues like the one in 43006db8, we can just install all
SQL files by default.
2026-01-14 09:45:57 +01:00
Jörg Thalheim
43006db835 meson: add missing schema file
This is missing from: https://github.com/NixOS/hydra/pull/1548
2026-01-14 09:39:43 +01:00
Janne Heß
4ebfaba862 Merge pull request #1548 from NixOS/fix/hashlengths
feat: Use short revision from git
2026-01-13 14:34:55 +00:00
Janne Heß
41daeb3cc9 Merge pull request #1553 from NixOS/dependabot/github_actions/actions/checkout-6
build(deps): bump actions/checkout from 3 to 6
2026-01-05 15:25:52 +00:00
Janne Heß
3b1b5009f3 Merge pull request #1552 from NixOS/dependabot/github_actions/peter-evans/create-pull-request-8
build(deps): bump peter-evans/create-pull-request from 5 to 8
2026-01-05 15:23:08 +00:00
dependabot[bot]
54699ae671 build(deps): bump actions/checkout from 3 to 6
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v6)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-05 15:00:59 +00:00
dependabot[bot]
503871bac4 build(deps): bump peter-evans/create-pull-request from 5 to 8
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 5 to 8.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](https://github.com/peter-evans/create-pull-request/compare/v5...v8)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-version: '8'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-05 15:00:54 +00:00
Janne Heß
6dde18cb5e Merge pull request #1551 from knedlsepp/fix-scmdiff
Fix broken api/scmdiff endpoint
2026-01-05 13:37:12 +00:00
Josef Kemetmüller
e0f65d4d3d Fix broken api/scmdiff endpoint
Same fix as in #1215, which got accidentally removed in #1506.
2026-01-05 14:16:47 +01:00
Janne Heß
5e2e9672cf Merge pull request #1549 from NixOS/feat/github-diffs
feat: Offload git diffs to GitHub
2026-01-05 13:10:00 +00:00
Janne Heß
650871b586 Merge pull request #602 from kquick/pathinput_freq
Allow PathInput to take an optional frequency parameter.
2026-01-04 19:09:26 +00:00
Janne Heß
b2030cd4ef Merge pull request #1294 from arianvp/patch-2
Document redirects in Hydra API
2026-01-04 18:55:13 +00:00
Janne Heß
44780c786e Merge branch 'master' into pathinput_freq 2026-01-04 19:05:05 +01:00
Janne Heß
2db62e86e7 feat: Store the short rev length 2026-01-04 19:01:49 +01:00
Arian van Putten
b88b06dd3c Document redirects in Hydra API
This documents useful redirects that Hydra exposes.
2026-01-04 19:01:31 +01:00
Janne Heß
d042e3c82c refactor: Revision for the frontend from one place 2026-01-04 18:23:44 +01:00
Janne Heß
a31f5d654c Merge pull request #1270 from b-bondurant/sysbuild-fix
Use project name in sysbuild query
2026-01-04 15:53:17 +00:00
Janne Heß
6659391e26 Merge pull request #1252 from MaxHearnden/master
Only guess domain when gitea_url is not set
2026-01-04 15:38:42 +00:00
Janne Heß
3b901f19a4 Merge branch 'master' into sysbuild-fix 2026-01-04 16:30:35 +01:00
Janne Heß
673e18415e Merge branch 'master' into master 2026-01-04 16:29:27 +01:00
Janne Heß
13ddeeb08c Merge pull request #1550 from NixOS/feat/update-flake
feat: Update flake inputs
2026-01-04 15:22:22 +00:00
Janne Heß
ed15b0a8ce feat: Update flake inputs 2026-01-04 15:58:57 +01:00
Janne Heß
f1b26134d7 feat: Offload git diffs to GitHub
If we are on GitHub, use their SCM diff by default, which is more
feature-rich and offloads the diff work to stronger infrastructure.
2026-01-04 15:49:25 +01:00
Janne Heß
425d78763d Merge pull request #1543 from diogotcorreia/fix-link-not-in-last-eval
fix: broken anchor tag in job.tt
2026-01-04 13:39:25 +00:00
Janne Heß
53d8e26b59 Merge pull request #1546 from jmbaur/jared/local-repro
build: quote flake URI for local repro instructions
2026-01-04 13:38:45 +00:00
Janne Heß
a439b7f591 Merge pull request #1547 from emhamm/hydra-fix-gitlab-pull-with-umlaute
hydra/plugins/gitlabpulls: use utf-8 encoding for gitlab-pulls-sorted…
2026-01-04 13:38:22 +00:00
Marian Hammer
7d12fa6a55 hydra/plugins/gitlabpulls: use utf-8 encoding for gitlab-pulls-sorted.json
This unbreaks umlauts.
2025-12-12 14:40:03 +01:00
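The underlying idea is to serialize the pull list straight to UTF-8 bytes so non-ASCII characters survive the write. A rough Python analogue of the Perl `JSON::MaybeXS->new(..., utf8 => 1)` change (`encode_pulls` is an illustrative name, not the plugin's API):

```python
import json

def encode_pulls(pulls: dict) -> bytes:
    # Serialize canonically (sorted keys, pretty-printed) and emit real
    # UTF-8 bytes so characters such as umlauts survive intact.
    text = json.dumps(pulls, sort_keys=True, indent=2, ensure_ascii=False)
    return text.encode("utf-8")
```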
Jared Baur
7a67ba925d build: quote flake URI for local repro instructions
Oftentimes flake URIs have ampersands in them, making them unsuitable
for pasting directly into shells.
2025-12-10 14:17:45 -08:00
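The quoting itself is a one-liner: wrap the whole installable in single quotes so `?` and `&` in the flake URI are not interpreted by the shell. A hedged Python sketch (the real fix is in a Template Toolkit template; `repro_command` is a made-up helper):

```python
import shlex

def repro_command(flake_uri: str, job: str) -> str:
    # Flake URIs often carry "?" and "&" (e.g. ?ref=...&rev=...); quote
    # the whole installable so the shell does not interpret them.
    return "nix build " + shlex.quote(f"{flake_uri}#hydraJobs.{job}")
```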
Diogo Correia
662d1198d4 fix: broken anchor tag in job.tt 2025-12-05 00:52:06 +01:00
John Ericson
34ff66a460 Merge pull request #1541 from NixOS/nixos-25.11
flake.nix: update to nixos-25.11
2025-11-25 21:23:57 +00:00
Martin Weinelt
7a42a3810c flake.nix: update to nixos-25.11
And squashes eval warnings from accessing pkgs.hostPlatform.
2025-11-25 15:23:17 +01:00
Martin Weinelt
3bd87005f7 Merge pull request #1540 from NixOS/pg17-update
package.nix: update postgresql to 17
2025-11-25 12:50:27 +00:00
Martin Weinelt
95fb69f60d package.nix: update postgresql to 17
NixOS 25.11 does not ship with PostgreSQL 13 any more.
2025-11-25 13:27:04 +01:00
Jörg Thalheim
241ab71800 Merge pull request #1536 from NixOS/fix-1535
Revert "Deduplicate protocol code more with `ServeProto::BasicClientConnection`"
2025-11-06 19:23:48 +00:00
Jörg Thalheim
78ed8d7aa5 Merge pull request #1533 from hacker1024/patch-3
GithubRefs: Allow arbitrary ref types
2025-11-06 09:38:05 +00:00
John Ericson
4bd941daa8 Revert "Deduplicate protocol code more with ServeProto::BasicClientConnection"
This reverts commit 58846b0a1c.
2025-10-30 14:01:38 -04:00
Joshua Leivenzon
d7b40c4233 GithubRefs: Allow arbitrary ref types
GitHub's reference list API does not actually restrict the specified type, so don't artificially restrict it.

The API does not actually make a distinction between the "type" and "prefix" at all, but this is maintained for backwards compatibility. The two are simply concatenated.
2025-10-16 16:35:31 +11:00
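Conceptually the plugin now just concatenates whatever type and prefix the user supplies into a single ref filter. A sketch assuming GitHub's documented "list matching references" endpoint (`/repos/{owner}/{repo}/git/matching-refs/{ref}`); the plugin's exact URL construction may differ:

```python
def refs_url(endpoint: str, owner: str, repo: str,
             ref_type: str, prefix: str) -> str:
    # "type" and "prefix" are simply concatenated into one ref filter;
    # the API itself does not distinguish between the two parts.
    return f"{endpoint}/repos/{owner}/{repo}/git/matching-refs/{ref_type}/{prefix}"
```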
Brad Bondurant
c6263c280c use project name in sysbuild query 2023-01-04 15:45:14 -05:00
MaxHearnden
4a0c5a2570 Only guess domain when gitea_url is not set
This allows for Gitea integration when not using a URI (e.g. gitea@example.com:example/example), so long as gitea_http_url is set.
2022-10-13 15:15:45 +01:00
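The precedence described here (explicit `gitea_url` first, guessed host second) can be sketched as follows; `gitea_host` is an illustrative Python stand-in for the plugin's Perl logic:

```python
from urllib.parse import urlparse

def gitea_host(input_uri, gitea_url=None):
    # Only guess the host from the input URI when gitea_url is unset;
    # SSH-style remotes like gitea@example.com:example/example have no
    # https host to parse, so the explicit setting must take precedence.
    if gitea_url is not None:
        return gitea_url
    return "https://" + urlparse(input_uri).netloc
```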
Kevin Quick
23fa93c5f8 Better update of timeout for the PathInput handler. 2020-06-09 09:00:02 -07:00
Kevin Quick
66730993fc Reconcile with changes from pullreq #775 2020-06-09 08:55:46 -07:00
Kevin Quick
25d1e8900a Allow PathInput to take an optional frequency parameter.
The previous version hard-coded the cache check frequency to 30
seconds.  This meant that the path was checked very frequently (max of
30 seconds and the evaluation period of the job), which could be
problematic for URL PathInput specifications, and especially ones that
are automatically updated frequently without *each* update necessarily
being interesting (for example, the Haskell Hackage index file).
2020-06-09 08:48:22 -07:00
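The optional second field and its fallback chain (per-input frequency, then the global `path_input_cache_validity_seconds` setting, then 30 seconds) can be sketched like this; `parse_path_input` is an illustrative Python stand-in for the plugin's `_parseValue`:

```python
def parse_path_input(value, config):
    # value is "<uri> [frequency]"; the cache-check timeout falls back
    # to path_input_cache_validity_seconds, then to 30 seconds.
    parts = value.split()
    uri = parts[0]
    if len(parts) > 1:
        timeout = int(parts[1])
    else:
        timeout = int(config.get("path_input_cache_validity_seconds", 30))
    return uri, timeout
```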
33 changed files with 283 additions and 193 deletions


@@ -16,7 +16,7 @@ jobs:
runner: ubuntu-24.04-arm
runs-on: ${{ matrix.runner }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v6
with:
fetch-depth: 0
- uses: cachix/install-nix-action@v31


@@ -11,12 +11,12 @@ jobs:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v6
- uses: cachix/install-nix-action@v31
- name: Update flake inputs
run: nix flake update
- name: Create Pull Request
uses: peter-evans/create-pull-request@v5
uses: peter-evans/create-pull-request@v8
with:
commit-message: "flake.lock: Update"
title: "Update flake inputs"

flake.lock (generated)

@@ -3,16 +3,16 @@
"nix": {
"flake": false,
"locked": {
"lastModified": 1760573252,
"narHash": "sha256-mcvNeNdJP5R7huOc8Neg0qZESx/0DMg8Fq6lsdx0x8U=",
"lastModified": 1772065213,
"narHash": "sha256-DbYpmZAD6aebwxepBop5Ub4S39sLg9UIJziTbeD832k=",
"owner": "NixOS",
"repo": "nix",
"rev": "3c39583e5512729f9c5a44c3b03b6467a2acd963",
"rev": "0769726d44b0782fecbd7b9749e24320c77af317",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "2.32-maintenance",
"ref": "2.33-maintenance",
"repo": "nix",
"type": "github"
}
@@ -20,32 +20,32 @@
"nix-eval-jobs": {
"flake": false,
"locked": {
"lastModified": 1760478325,
"narHash": "sha256-hA+NOH8KDcsuvH7vJqSwk74PyZP3MtvI/l+CggZcnTc=",
"lastModified": 1767025318,
"narHash": "sha256-i68miKHGdueWggcDAF+Kca9g6S3ipkW629XbMpQYfn0=",
"owner": "nix-community",
"repo": "nix-eval-jobs",
"rev": "daa42f9e9c84aeff1e325dd50fda321f53dfd02c",
"rev": "79dd7adbb5f75b08fb4b9bddd712ebc52baa46bc",
"type": "github"
},
"original": {
"owner": "nix-community",
"ref": "v2.32.1",
"ref": "v2.33.0",
"repo": "nix-eval-jobs",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1759652726,
"narHash": "sha256-2VjnimOYDRb3DZHyQ2WH2KCouFqYm9h0Rr007Al/WSA=",
"lastModified": 1772934839,
"narHash": "sha256-6mMYkB7BTTsc4thtCFbh3Aj5yth3EPI6L9L5DR6tpWc=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "06b2985f0cc9eb4318bf607168f4b15af1e5e81d",
"rev": "d351a3bce30b8f0d0a36281754b62942977fabe5",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-25.05-small",
"ref": "nixos-25.11-small",
"repo": "nixpkgs",
"type": "github"
}


@@ -1,16 +1,16 @@
{
description = "A Nix-based continuous build system";
inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-25.05-small";
inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-25.11-small";
inputs.nix = {
url = "github:NixOS/nix/2.32-maintenance";
url = "github:NixOS/nix/2.33-maintenance";
# We want to control the deps precisely
flake = false;
};
inputs.nix-eval-jobs = {
url = "github:nix-community/nix-eval-jobs/v2.32.1";
url = "github:nix-community/nix-eval-jobs/v2.33.0";
# We want to control the deps precisely
flake = false;
};
@@ -59,7 +59,7 @@
manual = forEachSystem (system: let
pkgs = nixpkgs.legacyPackages.${system};
hydra = self.packages.${pkgs.hostPlatform.system}.hydra;
hydra = self.packages.${pkgs.stdenv.hostPlatform.system}.hydra;
in
pkgs.runCommand "hydra-manual-${hydra.version}" { }
''


@@ -574,6 +574,131 @@ paths:
schema:
$ref: '#/components/schemas/JobsetEvalBuilds'
/jobset/{project-id}/{jobset-id}/latest-eval:
get:
summary: Redirects to the latest finished evaluation for a jobset
parameters:
- name: project-id
in: path
description: project identifier
required: true
schema:
type: string
- name: jobset-id
in: path
description: jobset identifier
required: true
schema:
type: string
responses:
'302':
description: the evaluation to redirect to
headers:
Location:
example: /eval/1?name={jobset-id}
schema:
type: string
/job/{project-id}/{jobset-id}/{job-id}/latest:
get:
summary: Redirects to the latest successful build for a job
parameters:
- name: project-id
in: path
description: project identifier
required: true
schema:
type: string
- name: jobset-id
in: path
description: jobset identifier
required: true
schema:
type: string
- name: job-id
in: path
description: job identifier
required: true
schema:
type: string
responses:
'302':
description: the build to redirect to
headers:
Location:
example: /build/1
schema:
type: string
/job/{project-id}/{jobset-id}/{job-id}/latest-for/{system}:
get:
summary: Redirects to the latest successful build for a job
parameters:
- name: project-id
in: path
description: project identifier
required: true
schema:
type: string
- name: jobset-id
in: path
description: jobset identifier
required: true
schema:
type: string
- name: job-id
in: path
description: job identifier
required: true
schema:
type: string
- name: system
in: path
description: system
required: true
schema:
type: string
example: x86_64-linux
responses:
'302':
description: the build to redirect to
headers:
Location:
example: /build/1
schema:
type: string
/job/{project-id}/{jobset-id}/{job-id}/latest-finished:
get:
summary: Redirects to the latest successful build for a job from a finished evaluation
parameters:
- name: project-id
in: path
description: project identifier
required: true
schema:
type: string
- name: jobset-id
in: path
description: jobset identifier
required: true
schema:
type: string
- name: job-id
in: path
description: job identifier
required: true
schema:
type: string
responses:
'302':
description: the build to redirect to
headers:
Location:
example: /build/1
schema:
type: string
components:
schemas:


@@ -4,7 +4,7 @@
hydra = { pkgs, lib,... }: {
_file = ./default.nix;
imports = [ ./hydra.nix ];
services.hydra-dev.package = lib.mkDefault self.packages.${pkgs.hostPlatform.system}.hydra;
services.hydra-dev.package = lib.mkDefault self.packages.${pkgs.stdenv.hostPlatform.system}.hydra;
};
hydraTest = { pkgs, ... }: {


@@ -31,7 +31,7 @@
, perl
, pixz
, boost
, postgresql_13
, postgresql_17
, nlohmann_json
, prometheus-cpp
@@ -192,7 +192,7 @@ stdenv.mkDerivation (finalAttrs: {
subversion
breezy
openldap
postgresql_13
postgresql_17
pixz
nix-eval-jobs
];


@@ -104,9 +104,9 @@ static void copyClosureTo(
std::unique_lock<std::timed_mutex> sendLock(conn.machine->state->sendLock,
std::chrono::seconds(600));
conn.importPaths(destStore, [&](Sink & sink) {
exportPaths(destStore, missing, sink);
});
conn.to << ServeProto::Command::ImportPaths;
exportPaths(destStore, missing, conn.to);
conn.to.flush();
if (readInt(conn.from) != 1)
throw Error("remote machine failed to import closure");
@@ -273,7 +273,7 @@ static BuildResult performBuild(
auto drvOutput = DrvOutput { outputHash, outputName };
successP->builtOutputs.insert_or_assign(
std::move(outputName),
Realisation { drvOutput, *outputPath });
Realisation { {.outPath = *outputPath}, drvOutput });
}
}
}
@@ -301,10 +301,11 @@ static void copyPathFromRemote(
lambda function only gets executed if someone tries to read
from source2, we will send the command from here rather
than outside the lambda. */
conn.narFromPath(localStore, info.path, [&](Source & source) {
TeeSource tee(source, sink);
extractNarData(tee, localStore.printStorePath(info.path), narMembers);
});
conn.to << ServeProto::Command::DumpStorePath << localStore.printStorePath(info.path);
conn.to.flush();
TeeSource tee(conn.from, sink);
extractNarData(tee, localStore.printStorePath(info.path), narMembers);
});
destStore.addToStore(info, *source2, NoRepair, NoCheckSigs);


@@ -537,12 +537,12 @@ void State::notifyBuildFinished(pqxx::work & txn, BuildID buildId,
std::shared_ptr<PathLocks> State::acquireGlobalLock()
{
Path lockPath = hydraData + "/queue-runner/lock";
auto lockPath = std::filesystem::path(hydraData) / "queue-runner/lock";
createDirs(dirOf(lockPath));
createDirs(lockPath.parent_path());
auto lock = std::make_shared<PathLocks>();
if (!lock->lockPaths(PathSet({lockPath}), "", false)) return 0;
if (!lock->lockPaths({lockPath}, "", false)) return 0;
return lock;
}


@@ -1,5 +1,6 @@
#include "state.hh"
#include "hydra-build-result.hh"
#include <nix/store/derived-path.hh>
#include <nix/store/globals.hh>
#include <nix/store/parsed-derivations.hh>
#include <nix/util/thread-pool.hh>
@@ -487,24 +488,24 @@ Step::ptr State::createStep(ref<Store> destStore,
it's not runnable yet, and other threads won't make it
runnable while step->created == false. */
step->drv = std::make_unique<Derivation>(localStore->readDerivation(drvPath));
{
try {
step->drvOptions = std::make_unique<DerivationOptions>(
DerivationOptions::fromStructuredAttrs(
step->drv->env,
step->drv->structuredAttrs ? &*step->drv->structuredAttrs : nullptr));
} catch (Error & e) {
e.addTrace({}, "while parsing derivation '%s'", localStore->printStorePath(drvPath));
throw;
}
DerivationOptions<nix::SingleDerivedPath> drvOptions;
try {
drvOptions = derivationOptionsFromStructuredAttrs(
*localStore,
step->drv->inputDrvs,
step->drv->env,
get(step->drv->structuredAttrs));
} catch (Error & e) {
e.addTrace({}, "while parsing derivation '%s'", localStore->printStorePath(drvPath));
throw;
}
step->preferLocalBuild = step->drvOptions->willBuildLocally(*localStore, *step->drv);
step->preferLocalBuild = drvOptions.willBuildLocally(*localStore, *step->drv);
step->isDeterministic = getOr(step->drv->env, "isDetermistic", "0") == "1";
step->systemType = step->drv->platform;
{
StringSet features = step->requiredSystemFeatures = step->drvOptions->getRequiredSystemFeatures(*step->drv);
StringSet features = step->requiredSystemFeatures = drvOptions.getRequiredSystemFeatures(*step->drv);
if (step->preferLocalBuild)
features.insert("local");
if (!features.empty()) {


@@ -172,7 +172,6 @@ struct Step
nix::StorePath drvPath;
std::unique_ptr<nix::Derivation> drv;
std::unique_ptr<nix::DerivationOptions> drvOptions;
nix::StringSet requiredSystemFeatures;
bool preferLocalBuild;
bool isDeterministic;


@@ -220,11 +220,11 @@ sub scmdiff : Path('/api/scmdiff') Args(0) {
my $clonePath = getSCMCacheDir . "/git/" . sha256_hex($uri);
die if ! -d $clonePath;
my ($stdout1, $stderr1);
run3(['git', '-C', $clonePath, 'log', "$rev1..$rev2"], \undef, \$stdout1, \$stderr1);
run3(['git', '--git-dir', '.git', '-C', $clonePath, 'log', "$rev1..$rev2"], \undef, \$stdout1, \$stderr1);
$diff .= $stdout1 if $? == 0;
my ($stdout2, $stderr2);
run3(['git', '-C', $clonePath, 'diff', "$rev1..$rev2"], \undef, \$stdout2, \$stderr2);
run3(['git', '--git-dir', '.git', '-C', $clonePath, 'diff', "$rev1..$rev2"], \undef, \$stdout2, \$stderr2);
$diff .= $stdout2 if $? == 0;
}


@@ -106,11 +106,11 @@ sub doEmailLogin {
my $allowed_domains = $c->config->{allowed_domains} // ($c->config->{persona_allowed_domains} // "");
if ($allowed_domains ne "") {
my $email_ok = 0;
my @domains = split ',', $allowed_domains;
my @domains = split /,/, $allowed_domains;
map { $_ =~ s/^\s*(.*?)\s*$/$1/ } @domains;
foreach my $domain (@domains) {
$email_ok = $email_ok || ((split '@', $email)[1] eq $domain);
$email_ok = $email_ok || ((split /@/, $email)[1] eq $domain);
}
error($c, "Your email address does not belong to a domain that is allowed to log in.\n")
unless $email_ok;


@@ -71,7 +71,7 @@ sub buildFinished {
my $to = $build->jobset->emailoverride ne "" ? $build->jobset->emailoverride : $build->maintainers;
foreach my $address (split ",", ($to // "")) {
foreach my $address (split /,/, ($to // "")) {
$address = trim $address;
$addresses{$address} //= { builds => [] };


@@ -38,7 +38,7 @@ sub _parseValue {
$start_options = 2;
}
foreach my $option (@parts[$start_options .. $#parts]) {
(my $key, my $value) = split('=', $option);
(my $key, my $value) = split(/=/, $option);
$options->{$key} = $value;
}
return ($uri, $branch, $deepClone, $options);
@@ -265,7 +265,7 @@ sub getCommits {
my $res = [];
foreach my $line (split /\n/, $out) {
my ($revision, $author, $email, $date) = split "\t", $line;
my ($revision, $author, $email, $date) = split /\t/, $line;
push @$res, { revision => $revision, author => decode("utf-8", $author), email => $email };
}


@@ -63,9 +63,9 @@ sub common {
my $accessToken = $self->{config}->{gitea_authorization}->{$repoOwner};
my $rev = $i->revision;
my $domain = URI->new($i->uri)->host;
my $host;
unless (defined $gitea_url) {
my $domain = URI->new($i->uri)->host;
$host = "https://$domain";
} else {
$host = $gitea_url->value;


@@ -31,10 +31,10 @@ sub _iterate {
$pulls->{$pull->{number}} = $pull;
}
# TODO Make Link header parsing more robust!!!
my @links = split ',', ($res->header("Link") // "");
my @links = split /,/, ($res->header("Link") // "");
my $next = "";
foreach my $link (@links) {
my ($url, $rel) = split ";", $link;
my ($url, $rel) = split /;/, $link;
if (trim($rel) eq 'rel="next"') {
$next = substr trim($url), 1, -1;
last;


@@ -18,9 +18,8 @@ tags) from GitHub following a certain naming scheme
=head1 DESCRIPTION
This plugin reads the list of branches or tags using GitHub's REST API. The name
of the reference must follow a particular prefix. This list is stored in the
nix-store and used as an input to declarative jobsets.
This plugin reads the list of branches or tags using GitHub's REST API. This
list is stored in the nix-store and used as an input to declarative jobsets.
=head1 CONFIGURATION
@@ -34,7 +33,7 @@ The declarative project C<spec.json> file must contain an input such as
"pulls": {
"type": "github_refs",
"value": "[owner] [repo] heads|tags - [prefix]",
"value": "[owner] [repo] [type] - [prefix]",
"emailresponsible": false
}
@@ -42,12 +41,11 @@ In the above snippet, C<[owner]> is the repository owner and C<[repo]> is the
repository name. Also note a literal C<->, which is placed there for the future
use.
C<heads|tags> denotes that one of these two is allowed, that is, the third
position should hold either the C<heads> or the C<tags> keyword. In case of the former, the plugin
will fetch all branches, while in case of the latter, it will fetch the tags.
C<[type]> is the type of ref to list. Typical values are "heads", "tags", and
"pull". "." will include all types.
C<prefix> denotes the prefix the reference name must start with, in order to be
included.
included. "." will include all references.
For example, C<"value": "nixos hydra heads - release/"> refers to
L<https://github.com/nixos/hydra> repository, and will fetch all branches that
@@ -85,10 +83,10 @@ sub _iterate {
$refs->{$ref_name} = $ref;
}
# TODO Make Link header parsing more robust!!!
my @links = split ',', $res->header("Link");
my @links = split /,/, $res->header("Link");
my $next = "";
foreach my $link (@links) {
my ($url, $rel) = split ";", $link;
my ($url, $rel) = split /;/, $link;
if (trim($rel) eq 'rel="next"') {
$next = substr trim($url), 1, -1;
last;
@@ -102,8 +100,6 @@ sub fetchInput {
return undef if $input_type ne "github_refs";
my ($owner, $repo, $type, $fut, $prefix) = split ' ', $value;
die "type field is neither 'heads' nor 'tags', but '$type'"
unless $type eq 'heads' or $type eq 'tags';
my $auth = $self->{config}->{github_authorization}->{$owner};
my $githubEndpoint = $self->{config}->{github_endpoint} // "https://api.github.com";


@@ -49,10 +49,10 @@ sub _iterate {
$pulls->{$pull->{iid}} = $pull;
}
# TODO Make Link header parsing more robust!!!
my @links = split ',', $res->header("Link");
my @links = split /,/, $res->header("Link");
my $next = "";
foreach my $link (@links) {
my ($url, $rel) = split ";", $link;
my ($url, $rel) = split /;/, $link;
if (trim($rel) eq 'rel="next"') {
$next = substr trim($url), 1, -1;
last;
@@ -84,7 +84,7 @@ sub fetchInput {
my $tempdir = File::Temp->newdir("gitlab-pulls" . "XXXXX", TMPDIR => 1);
my $filename = "$tempdir/gitlab-pulls-sorted.json";
open(my $fh, ">", $filename) or die "Cannot open $filename for writing: $!";
print $fh JSON::MaybeXS->new(canonical => 1, pretty => 1)->encode(\%pulls);
print $fh JSON::MaybeXS->new(canonical => 1, pretty => 1, utf8 => 1)->encode(\%pulls);
close $fh;
my $storePath = addToStore($filename);
my $timestamp = time;


@@ -126,7 +126,7 @@ sub getCommits {
my $res = [];
foreach my $line (split /\n/, $out) {
if ($line ne "") {
my ($revision, $author, $email) = split "\t", $line;
my ($revision, $author, $email) = split /\t/, $line;
push @$res, { revision => $revision, author => $author, email => $email };
}
}


@@ -12,19 +12,31 @@ sub supportedInputTypes {
$inputTypes->{'path'} = 'Local path or URL';
}
sub _parseValue {
# The input is a local path or URL, optionally followed by a
# time period specified in seconds.
my ($config, $value) = @_;
my @parts = split ' ', $value;
(my $uri, my $freq) = @parts;
# By default don't check a path more often than every 30 seconds,
# but the second path argument can change that value or the global
# path_input_cache_validity_seconds configuration, in that order.
my $timeout = defined $freq ? $freq : ($config->{path_input_cache_validity_seconds} // 30);
return ($uri, $timeout);
}
sub fetchInput {
my ($self, $type, $name, $value) = @_;
return undef if $type ne "path";
my $uri = $value;
my ($uri, $timeout) = _parseValue($self->{config}, $value);
my $timestamp = time;
my $sha256;
my $storePath;
my $timeout = $self->{config}->{path_input_cache_validity_seconds} // 30;
# Some simple caching: don't check a path more than once every N seconds.
(my $cachedInput) = $self->{db}->resultset('CachedPathInputs')->search(
{srcpath => $uri, lastseen => {">", $timestamp - $timeout}},


@@ -85,7 +85,7 @@ sub isBuildEligibleForDynamicRunCommand {
sub configSectionMatches {
my ($name, $project, $jobset, $job) = @_;
my @elems = split ':', $name;
my @elems = split /:/, $name;
die "invalid section name '$name'\n" if scalar(@elems) > 3;


@@ -563,7 +563,7 @@ makeQueries('', "");
makeQueries('ForProject', "and jobset_id in (select id from jobsets j where j.project = ?)");
makeQueries('ForJobset', "and jobset_id = ?");
makeQueries('ForJob', "and jobset_id = ? and job = ?");
makeQueries('ForJobName', "and jobset_id = (select id from jobsets j where j.name = ?) and job = ?");
makeQueries('ForJobName', "and jobset_id = (select id from jobsets j where j.project = ? and j.name = ?) and job = ?");
sub as_json {
my ($self) = @_;


@@ -66,6 +66,11 @@ __PACKAGE__->table("jobsetevalinputs");
data_type: 'text'
is_nullable: 1
=head2 shortRevLength
data_type: 'number'
is_nullable: 1
=head2 value
data_type: 'text'
@@ -102,6 +107,8 @@ __PACKAGE__->add_columns(
{ data_type => "text", is_nullable => 1 },
"revision",
{ data_type => "text", is_nullable => 1 },
"shortRevLength",
{ data_type => "integer", is_nullable => 1 },
"value",
{ data_type => "text", is_nullable => 1 },
"dependency",
@@ -183,4 +190,28 @@ sub json_hint {
return \%hint;
}
# Revision to be rendered by the frontend
sub frontend_revision() {
my ($self) = @_;
my $type = $self->get_column('type');
if ($type eq 'svn' or $type eq 'svn-checkout' or $type eq 'bzr' or $type eq 'bzr-checkout') {
return 'r' . $self->get_column('revision');
} elsif ($type eq 'git') {
# Find the longest revision length of this URI
my $schema = $self->result_source->schema;
my $maxLength = $schema
->resultset('JobsetEvalInputs')
->search({ uri => $self->get_column('uri')})
->get_column('shortRevLength')
->max;
# Fall back to a fixed value if there was no value
return substr($self->get_column('revision'), 0, $maxLength || 12);
} elsif ($type eq 'bzr') {
return substr($self->get_column('revision'), 0, 12);
} else {
return $self->get_column('revision');
}
}
1;


@@ -568,7 +568,7 @@ END;
running the following command:</p>
<div class="card bg-light"><div class="card-body p-2"><code>
<span class="shell-prompt"># </span>nix build [% HTML.escape(eval.flake) %]#hydraJobs.[% HTML.escape(job) %]
<span class="shell-prompt"># </span>nix build '[% HTML.escape(eval.flake) %]#hydraJobs.[% HTML.escape(job) %]'
</code></div></div>
[% ELSE %]


@@ -347,13 +347,24 @@ BLOCK renderDiffUri;
url = res.0;
branch = res.1;
IF bi1.type == "hg" || bi1.type == "git" %]
<a target="_blank" [% HTML.attributes(href => c.uri_for('/api/scmdiff', {
uri = url,
rev1 = bi1.revision,
rev2 = bi2.revision,
type = bi1.type,
branch = branch
})) %]>[% HTML.escape(contents) %]</a>
[% IF url.substr(0, 19) == "https://github.com/";
github_url = url.replace('\.git$', '') %]
<a target="_blank" [% HTML.attributes(href =>
github_url
_ "/compare/"
_ bi1.revision
_ "..."
_ bi2.revision,
) %]>[% HTML.escape(contents) %]</a>
[% ELSE %]
<a target="_blank" [% HTML.attributes(href => c.uri_for('/api/scmdiff', {
uri = url,
rev1 = bi1.revision,
rev2 = bi2.revision,
type = bi1.type,
branch = branch
})) %]>[% HTML.escape(contents) %]</a>
[% END %]
[% ELSE;
contents;
END;
@@ -411,7 +422,7 @@ BLOCK renderInputDiff; %]
[% ELSIF bi1.uri == bi2.uri && bi1.revision != bi2.revision %]
[% IF bi1.type == "git" %]
<tr><td>
<b>[% HTML.escape(bi1.name) %]</b></td><td><tt>[% INCLUDE renderDiffUri contents=(bi1.revision.substr(0, 12) _ ' to ' _ bi2.revision.substr(0, 12)) %]</tt>
<b>[% HTML.escape(bi1.name) %]</b></td><td><tt>[% INCLUDE renderDiffUri contents=(bi1.frontend_revision _ ' to ' _ bi2.frontend_revision) %]</tt>
</td></tr>
[% ELSE %]
<tr><td>
@@ -452,16 +463,10 @@ BLOCK renderPager %]
BLOCK renderShortEvalInput;
IF input.type == "svn" || input.type == "svn-checkout" || input.type == "bzr" || input.type == "bzr-checkout" %]
r[% input.revision %]
[% ELSIF input.type == "git" %]
<tt>[% input.revision.substr(0, 7) | html %]</tt>
[% ELSIF input.type == "hg" %]
<tt>[% input.revision.substr(0, 12) | html %]</tt>
[% ELSIF input.type == "build" || input.type == "sysbuild" %]
IF input.type == "build" || input.type == "sysbuild" %]
<a [% HTML.attributes(href => c.uri_for('/build' input.get_column('dependency'))) %]>[% HTML.escape(input.get_column('dependency')) %]</a>
[% ELSE %]
<tt>[% input.revision | html %]</tt>
<tt>[% input.frontend_revision | html %]</tt>
[% END;
END;


@@ -9,8 +9,8 @@
[% INCLUDE includeFlot %]
[% IF !jobExists(jobset, job) %]
<div class="alert alert-warning">This job is not a member of the <a
[% HTML.attributes(href => c.uri_for('/jobset' project.name jobset.name
<div class="alert alert-warning">This job is not a member of the
<a [% HTML.attributes(href => c.uri_for('/jobset' project.name jobset.name
'evals')) %]>latest evaluation</a> of its jobset. This means it was
removed or had an evaluation error.</div>
[% END %]


@@ -117,7 +117,7 @@ else
revCount="$(cat "$tmpDir/[% input.name %]/rev-count")"
fi
args+=(--arg '[% input.name %]' "{ outPath = $inputDir; rev = \"[% input.revision %]\"; shortRev = \"[% input.revision.substr(0, 7) %]\"; revCount = $revCount; }")
args+=(--arg '[% input.name %]' "{ outPath = $inputDir; rev = \"[% input.revision %]\"; shortRev = \"[% input.frontend_revision %]\"; revCount = $revCount; }")
[%+ ELSIF input.type == "hg" %]


@@ -160,7 +160,7 @@ sub fetchInputSystemBuild {
$jobsetName ||= $jobset->name;
my @latestBuilds = $db->resultset('LatestSucceededForJobName')
->search({}, {bind => [$jobsetName, $jobName]});
->search({}, {bind => [$projectName, $jobsetName, $jobName]});
my @validBuilds = ();
foreach my $build (@latestBuilds) {
@@ -891,6 +891,7 @@ sub checkJobsetWrapped {
, type => $input->{type}
, uri => $input->{uri}
, revision => $input->{revision}
, shortRevLength => length($input->{shortRev})
, value => $input->{value}
, dependency => $input->{id}
, path => $input->{storePath} || "" # !!! temporary hack


@@ -487,11 +487,12 @@ create table JobsetEvalInputs (
altNr integer not null,
-- Copied from the jobsetinputs from which the build was created.
type text not null,
uri text,
revision text,
value text,
dependency integer, -- build ID of the input, for type == 'build'
type text not null,
uri text,
revision text,
shortRevLength smallint, -- length of a short revision at the time this was checked out
value text,
dependency integer, -- build ID of the input, for type == 'build'
path text,


@@ -1,90 +1,7 @@
sql_files = files(
'hydra.sql',
'test.sql',
'update-dbix.pl',
'upgrade-2.sql',
'upgrade-3.sql',
'upgrade-4.sql',
'upgrade-5.sql',
'upgrade-6.sql',
'upgrade-7.sql',
'upgrade-8.sql',
'upgrade-9.sql',
'upgrade-10.sql',
'upgrade-11.sql',
'upgrade-12.sql',
'upgrade-13.sql',
'upgrade-14.sql',
'upgrade-15.sql',
'upgrade-16.sql',
'upgrade-17.sql',
'upgrade-18.sql',
'upgrade-19.sql',
'upgrade-20.sql',
'upgrade-21.sql',
'upgrade-22.sql',
'upgrade-23.sql',
'upgrade-24.sql',
'upgrade-25.sql',
'upgrade-26.sql',
'upgrade-27.sql',
'upgrade-28.sql',
'upgrade-29.sql',
'upgrade-30.sql',
'upgrade-31.sql',
'upgrade-32.sql',
'upgrade-33.sql',
'upgrade-34.sql',
'upgrade-35.sql',
'upgrade-36.sql',
'upgrade-37.sql',
'upgrade-38.sql',
'upgrade-39.sql',
'upgrade-40.sql',
'upgrade-41.sql',
'upgrade-42.sql',
'upgrade-43.sql',
'upgrade-44.sql',
'upgrade-45.sql',
'upgrade-46.sql',
'upgrade-47.sql',
'upgrade-48.sql',
'upgrade-49.sql',
'upgrade-50.sql',
'upgrade-51.sql',
'upgrade-52.sql',
'upgrade-53.sql',
'upgrade-54.sql',
'upgrade-55.sql',
'upgrade-56.sql',
'upgrade-57.sql',
'upgrade-58.sql',
'upgrade-59.sql',
'upgrade-60.sql',
'upgrade-61.sql',
'upgrade-62.sql',
'upgrade-63.sql',
'upgrade-64.sql',
'upgrade-65.sql',
'upgrade-66.sql',
'upgrade-67.sql',
'upgrade-68.sql',
'upgrade-69.sql',
'upgrade-70.sql',
'upgrade-71.sql',
'upgrade-72.sql',
'upgrade-73.sql',
'upgrade-74.sql',
'upgrade-75.sql',
'upgrade-76.sql',
'upgrade-77.sql',
'upgrade-78.sql',
'upgrade-79.sql',
'upgrade-80.sql',
'upgrade-81.sql',
'upgrade-82.sql',
'upgrade-83.sql',
'upgrade-84.sql',
# Install all SQL files in this directory.
# This includes hydra.sql, test.sql, update-dbix.pl, and all upgrade-*.sql files.
install_subdir('.',
install_dir: hydra_libexecdir / 'sql',
strip_directory: true,
exclude_files: ['meson.build', 'update-dbix-harness.sh'],
)
install_data(sql_files, install_dir: hydra_libexecdir / 'sql')

src/sql/upgrade-85.sql (new file)

@@ -0,0 +1 @@
ALTER TABLE JobsetEvalInputs ADD COLUMN shortRevLength smallint;


@@ -109,7 +109,7 @@ subtest "Build: not substitutable, unsubstitutable" => sub {
subtest "Second notification: step_finished" => sub {
my ($channelName, $pid, $payload) = @{$dbh->func("pg_notifies")};
is($channelName, "step_finished", "The event is for the step finishing");
my ($buildId, $stepNr, $logFile) = split "\t", $payload;
my ($buildId, $stepNr, $logFile) = split /\t/, $payload;
is($buildId, $build->id, "The payload is the build's ID");
is($stepNr, 1, "The payload is the build's step number");
isnt($logFile, undef, "The log file is passed");