Merge master into staging-next

Authored by nixpkgs-ci[bot], committed by GitHub
Commits 13b34c40 a8600a49

+369 -62
+51 -11
doc/stdenv/stdenv.chapter.md
··· 327 328 To determine the exact rules for dependency propagation, we start by assigning to each dependency a couple of ternary numbers (`-1` for `build`, `0` for `host`, and `1` for `target`) representing its [dependency type](#possible-dependency-types), which captures how its host and target platforms are each "offset" from the depending derivation’s host and target platforms. The following table summarizes the different combinations that can be obtained: 329 330 - | `host → target` | attribute name | offset | 331 - | ------------------- | ------------------- | -------- | 332 - | `build --> build` | `depsBuildBuild` | `-1, -1` | 333 - | `build --> host` | `nativeBuildInputs` | `-1, 0` | 334 - | `build --> target` | `depsBuildTarget` | `-1, 1` | 335 - | `host --> host` | `depsHostHost` | `0, 0` | 336 - | `host --> target` | `buildInputs` | `0, 1` | 337 - | `target --> target` | `depsTargetTarget` | `1, 1` | 338 339 Algorithmically, we traverse propagated inputs, accumulating every propagated dependency’s propagated dependencies and adjusting them to account for the “shift in perspective” described by the current dependency’s platform offsets. This results in a sort of transitive closure of the dependency relation, with the offsets being approximately summed when two dependency links are combined. We also prune transitive dependencies whose combined offsets go out-of-bounds, which can be viewed as a filter over that transitive closure removing dependencies that are blatantly absurd. 340 341 - We can define the process precisely with [Natural Deduction](https://en.wikipedia.org/wiki/Natural_deduction) using the inference rules. This probably seems a bit obtuse, but so is the bash code that actually implements it! [^footnote-stdenv-find-inputs-location] They’re confusing in very different ways so… hopefully if something doesn’t make sense in one presentation, it will in the other! 
342 343 ``` 344 let mapOffset(h, t, i) = i + (if i <= 0 then h else t - 1) ··· 372 dep(h, t, A, B) 373 ``` 374 375 - Some explanation of this monstrosity is in order. In the common case, the target offset of a dependency is the successor to the host offset: `t = h + 1`. That means that: 376 377 ``` 378 let f(h, t, i) = i + (if i <= 0 then h else t - 1) ··· 383 384 This is where “sum-like” comes in from above: We can just sum all of the host offsets to get the host offset of the transitive dependency. The target offset of the transitive dependency is the host offset + 1, just as it was with the dependencies composed to make this transitive one; it can be ignored as it doesn’t add any new information. 385 386 - Because of the bounds checks, the uncommon cases are `h = t` and `h + 2 = t`. In the former case, the motivation for `mapOffset` is that since its host and target platforms are the same, no transitive dependency of it should be able to “discover” an offset greater than its reduced target offsets. `mapOffset` effectively “squashes” all its transitive dependencies’ offsets so that none will ever be greater than the target offset of the original `h = t` package. In the other case, `h + 1` is skipped over between the host and target offsets. Instead of squashing the offsets, we need to “rip” them apart so no transitive dependency’s offset is that one. 387 388 Overall, the unifying theme here is that propagation shouldn’t be introducing transitive dependencies involving platforms the depending package is unaware of. \[One can imagine the depending package asking for dependencies with the platforms it knows about; other platforms it doesn’t know how to ask for. The platform description in that scenario is a kind of unforgeable capability.\] The offset bounds checking and definition of `mapOffset` together ensure that this is the case. 
Discovering a new offset is discovering a new platform, and since those platforms weren’t in the derivation “spec” of the needing package, they cannot be relevant. From a capability perspective, we can imagine that the host and target platforms of a package are the capabilities a package requires, and the depending package must provide the capability to the dependency. 389
··· 327 328 To determine the exact rules for dependency propagation, we start by assigning to each dependency a couple of ternary numbers (`-1` for `build`, `0` for `host`, and `1` for `target`) representing its [dependency type](#possible-dependency-types), which captures how its host and target platforms are each "offset" from the depending derivation’s host and target platforms. The following table summarizes the different combinations that can be obtained: 329 330 + | `host → target` | attribute name | offset | typical purpose | 331 + | ------------------- | ------------------- | -------- | --------------------------------------------- | 332 + | `build --> build` | `depsBuildBuild` | `-1, -1` | compilers for build helpers | 333 + | `build --> host` | `nativeBuildInputs` | `-1, 0` | build tools, compilers, setup hooks | 334 + | `build --> target` | `depsBuildTarget` | `-1, 1` | compilers to build stdlibs to run on target | 335 + | `host --> host` | `depsHostHost` | `0, 0` | compilers to build C code at runtime (rare) | 336 + | `host --> target` | `buildInputs` | `0, 1` | libraries | 337 + | `target --> target` | `depsTargetTarget` | `1, 1` | stdlibs to run on target | 338 339 Algorithmically, we traverse propagated inputs, accumulating every propagated dependency’s propagated dependencies and adjusting them to account for the “shift in perspective” described by the current dependency’s platform offsets. This results in a sort of transitive closure of the dependency relation, with the offsets being approximately summed when two dependency links are combined. We also prune transitive dependencies whose combined offsets go out-of-bounds, which can be viewed as a filter over that transitive closure removing dependencies that are blatantly absurd. 340 341 + We can define the process precisely with [Natural Deduction](https://en.wikipedia.org/wiki/Natural_deduction) using the inference rules below. 
This probably seems a bit obtuse, but so is the bash code that actually implements it! [^footnote-stdenv-find-inputs-location] They’re confusing in very different ways so… hopefully if something doesn’t make sense in one presentation, it will in the other! 342 + 343 + **Definitions:** 344 + 345 + `dep(h_offset, t_offset, X, Y)` 346 + : Package X has a direct dependency on Y in a position with host offset `h_offset` and target offset `t_offset`. 347 + 348 + For example, `nativeBuildInputs = [ Y ]` means `dep(-1, 0, X, Y)`. 349 + 350 + `propagated-dep(h_offset, t_offset, X, Y)` 351 + : Package X has a propagated dependency on Y in a position with host offset `h_offset` and target offset `t_offset`. 352 + 353 + For example, `depsBuildTargetPropagated = [ Y ]` means `propagated-dep(-1, 1, X, Y)`. 354 + 355 + `mapOffset(h, t, i) = offs` 356 + : In a package X with a dependency on Y in a position with host offset `h` and target offset `t`, Y's transitive dependency Z in a position with offset `i` is mapped to offset `offs` in X. 357 + 358 + 359 + ::: {.example} 360 + # Truth table of `mapOffset(h, t, i)` 361 + 362 + `x` means that the dependency was discarded because `h + i ∉ {-1, 0, 1}`. 363 + 364 + <!-- This is written as an ascii art table because the CSS was introducing so much space it was unreadable and doesn't support double lines --> 365 + 366 + ``` 367 + h | t || i=-1 | i=0 | i=1 368 + ----|------||------|------|----- 369 + -1 | -1 || x | -1 | -1 370 + -1 | 0 || x | -1 | 0 371 + -1 | 1 || x | -1 | 1 372 + 0 | 0 || -1 | 0 | 0 373 + 0 | 1 || -1 | 0 | 1 374 + 1 | 1 || 0 | 1 | x 375 + ``` 376 + 377 + ::: 378 379 ``` 380 let mapOffset(h, t, i) = i + (if i <= 0 then h else t - 1) ··· 408 dep(h, t, A, B) 409 ``` 410 411 + Some explanation of this monstrosity is in order. In the common case of `nativeBuildInputs` or `buildInputs`, the target offset of a dependency is one greater than the host offset: `t = h + 1`. 
That means that: 412 413 ``` 414 let f(h, t, i) = i + (if i <= 0 then h else t - 1) ··· 419 420 This is where “sum-like” comes in from above: We can just sum all of the host offsets to get the host offset of the transitive dependency. The target offset of the transitive dependency is the host offset + 1, just as it was with the dependencies composed to make this transitive one; it can be ignored as it doesn’t add any new information. 421 422 + Because of the bounds checks, the uncommon cases are `h = t` (`depsBuildBuild`, etc.) and `h + 2 = t` (`depsBuildTarget`). 423 424 + In the former case, the motivation for `mapOffset` is that since its host and target platforms are the same, no transitive dependency of it should be able to “discover” an offset greater than its reduced target offsets. `mapOffset` effectively “squashes” all its transitive dependencies’ offsets so that none will ever be greater than the target offset of the original `h = t` package. 425 426 + In the other case, `h + 1` (0) is skipped over between the host (-1) and target (1) offsets. Instead of squashing the offsets, we need to “rip” them apart so no transitive dependency’s offset is 0. 427 428 Overall, the unifying theme here is that propagation shouldn’t be introducing transitive dependencies involving platforms the depending package is unaware of. \[One can imagine the depending package asking for dependencies with the platforms it knows about; other platforms it doesn’t know how to ask for. The platform description in that scenario is a kind of unforgeable capability.\] The offset bounds checking and definition of `mapOffset` together ensure that this is the case. Discovering a new offset is discovering a new platform, and since those platforms weren’t in the derivation “spec” of the needing package, they cannot be relevant. 
From a capability perspective, we can imagine that the host and target platforms of a package are the capabilities a package requires, and the depending package must provide the capability to the dependency. 429
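The offset arithmetic the chapter describes is small enough to model directly. Below is a minimal Python sketch (not the actual bash implementation in nixpkgs) of `mapOffset` plus the bounds-checked combination step; the `combine` helper name is ours, and the loop at the end reproduces the truth table from the new example in the diff above.

```python
# Hypothetical model of the offset arithmetic described in the stdenv
# chapter; the real implementation is bash in nixpkgs' setup hooks.

BOUNDS = (-1, 0, 1)  # the only valid host/target offsets

def map_offset(h: int, t: int, i: int) -> int:
    # mapOffset(h, t, i) = i + (if i <= 0 then h else t - 1)
    return i + (h if i <= 0 else t - 1)

def combine(h: int, t: int, h2: int, t2: int):
    """Combine dep(h, t, A, B) with B's propagated dependency at
    offsets (h2, t2), yielding A's offsets for the transitive
    dependency, or None when the bounds check discards it
    (i.e. h + i is outside {-1, 0, 1} for either offset)."""
    if any(h + i not in BOUNDS for i in (h2, t2)):
        return None
    return map_offset(h, t, h2), map_offset(h, t, t2)

# Reproduce the truth table: rows keyed by (h, t), columns i = -1, 0, 1,
# with "x" marking a discarded (out-of-bounds) dependency.
table = {}
for h, t in [(-1, -1), (-1, 0), (-1, 1), (0, 0), (0, 1), (1, 1)]:
    table[(h, t)] = tuple(
        map_offset(h, t, i) if h + i in BOUNDS else "x" for i in BOUNDS
    )

# e.g. a propagated buildInput (0, 1) of a nativeBuildInput (-1, 0)
# lands at offsets (-1, 0), i.e. it behaves like a nativeBuildInput:
print(combine(-1, 0, 0, 1))  # (-1, 0)
```

Note how the pruning condition is on `h + i` before mapping, not on the mapped result: for `h = t = 1`, `mapOffset(1, 1, 1)` would be the in-bounds value `1`, yet the table marks that cell `x` because `h + i = 2`.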
+12
maintainers/maintainer-list.nix
··· 7840 name = "Elis Hirwing"; 7841 keys = [ { fingerprint = "67FE 98F2 8C44 CF22 1828 E12F D57E FA62 5C9A 925F"; } ]; 7842 }; 7843 euank = { 7844 email = "euank-nixpkg@euank.com"; 7845 github = "euank"; ··· 10945 github = "iosmanthus"; 10946 githubId = 16307070; 10947 name = "iosmanthus"; 10948 }; 10949 iquerejeta = { 10950 github = "iquerejeta";
··· 7840 name = "Elis Hirwing"; 7841 keys = [ { fingerprint = "67FE 98F2 8C44 CF22 1828 E12F D57E FA62 5C9A 925F"; } ]; 7842 }; 7843 + eu90h = { 7844 + email = "stefan@eu90h.com"; 7845 + github = "eu90h"; 7846 + githubId = 5161785; 7847 + name = "Stefan"; 7848 + }; 7849 euank = { 7850 email = "euank-nixpkg@euank.com"; 7851 github = "euank"; ··· 10951 github = "iosmanthus"; 10952 githubId = 16307070; 10953 name = "iosmanthus"; 10954 + }; 10955 + iqubic = { 10956 + email = "sophia.b.caspe@gmail.com"; 10957 + github = "iqubic"; 10958 + githubId = 22628816; 10959 + name = "Sophia Caspe"; 10960 }; 10961 iquerejeta = { 10962 github = "iquerejeta";
+2 -2
pkgs/applications/graphics/ImageMagick/default.nix
··· 85 86 stdenv.mkDerivation (finalAttrs: { 87 pname = "imagemagick"; 88 - version = "7.1.1-47"; 89 90 src = fetchFromGitHub { 91 owner = "ImageMagick"; 92 repo = "ImageMagick"; 93 tag = finalAttrs.version; 94 - hash = "sha256-lRPGVGv86vH7Q1cLoLp8mOAkxcHTHgUrx0mmKgl1oEc="; 95 }; 96 97 outputs = [
··· 85 86 stdenv.mkDerivation (finalAttrs: { 87 pname = "imagemagick"; 88 + version = "7.1.2-0"; 89 90 src = fetchFromGitHub { 91 owner = "ImageMagick"; 92 repo = "ImageMagick"; 93 tag = finalAttrs.version; 94 + hash = "sha256-4x0+yELmXstv9hPuwzMGcKiTa1rZtURZgwSSVIhzAkE="; 95 }; 96 97 outputs = [
+14 -14
pkgs/applications/networking/browsers/chromium/info.json
··· 1 { 2 "chromium": { 3 - "version": "138.0.7204.100", 4 "chromedriver": { 5 - "version": "138.0.7204.101", 6 - "hash_darwin": "sha256-ow+R2jcfm5tryB6UfnUNklVfLGc2Tzj2W6Nul6pRglI=", 7 - "hash_darwin_aarch64": "sha256-GGcDoSkH8Z4N8yOL77nNMtz3BY4lNwlD10SPhEBRpJI=" 8 }, 9 "deps": { 10 "depot_tools": { ··· 20 "DEPS": { 21 "src": { 22 "url": "https://chromium.googlesource.com/chromium/src.git", 23 - "rev": "5f45b4744e3d5ba82c2ca6d942f1e7a516110752", 24 - "hash": "sha256-bI75IXPl6YeauK2oTnUURh1ch1H7KKw/QzKYZ/q6htI=", 25 "recompress": true 26 }, 27 "src/third_party/clang-format/script": { ··· 96 }, 97 "src/third_party/angle": { 98 "url": "https://chromium.googlesource.com/angle/angle.git", 99 - "rev": "df15136b959fc60c230265f75ee7fc75c96e8250", 100 - "hash": "sha256-b4bGxhtrsfmVdJo/5QT4/mtQ6hqxmfpmcrieqaT9/ls=" 101 }, 102 "src/third_party/angle/third_party/glmark2/src": { 103 "url": "https://chromium.googlesource.com/external/github.com/glmark2/glmark2", ··· 131 }, 132 "src/third_party/dawn": { 133 "url": "https://dawn.googlesource.com/dawn.git", 134 - "rev": "86772f20cca54b46f62b65ece1ef61224aef09db", 135 - "hash": "sha256-N9DVbQE56WWBmJ/PJlYhU+pr8I+PFf/7FzMLCNqx3hg=" 136 }, 137 "src/third_party/dawn/third_party/glfw": { 138 "url": "https://chromium.googlesource.com/external/github.com/glfw/glfw", ··· 246 }, 247 "src/third_party/devtools-frontend/src": { 248 "url": "https://chromium.googlesource.com/devtools/devtools-frontend", 249 - "rev": "a6dbe06dafbad00ef4b0ea139ece1a94a5e2e6d8", 250 - "hash": "sha256-XkyJFRxo3ZTBGfKdTwSIo14SLNPQAKQvY4lEX03j6LM=" 251 }, 252 "src/third_party/dom_distiller_js/dist": { 253 "url": "https://chromium.googlesource.com/chromium/dom-distiller/dist.git", ··· 796 }, 797 "src/v8": { 798 "url": "https://chromium.googlesource.com/v8/v8.git", 799 - "rev": "e5b4c78b54e8b033b2701db3df0bf67d3030e4c1", 800 - "hash": "sha256-5y/yNZopnwtDrG+BBU6fMEi0yJJoYvsygQR+fl6vS/Y=" 801 } 802 } 803 },
··· 1 { 2 "chromium": { 3 + "version": "138.0.7204.157", 4 "chromedriver": { 5 + "version": "138.0.7204.158", 6 + "hash_darwin": "sha256-rNd7glDAVNkd4CNn4k3rdpb//yD/ccpebnGhDv1EGb8=", 7 + "hash_darwin_aarch64": "sha256-oUMFW09mp2aUgplboMHaKvTVbKtqAy5C0KsA7DXbElc=" 8 }, 9 "deps": { 10 "depot_tools": { ··· 20 "DEPS": { 21 "src": { 22 "url": "https://chromium.googlesource.com/chromium/src.git", 23 + "rev": "e533e98b1267baa1f1c46d666b120e64e5146aa9", 24 + "hash": "sha256-LbZ8/6Lvz1p3ydRL4fXtd7RL426PU3jU01Hx+DP5QYQ=", 25 "recompress": true 26 }, 27 "src/third_party/clang-format/script": { ··· 96 }, 97 "src/third_party/angle": { 98 "url": "https://chromium.googlesource.com/angle/angle.git", 99 + "rev": "e1dc0a7ab5d1f1f2edaa7e41447d873895e083bf", 100 + "hash": "sha256-tkHvTkqbm4JtWnh41iu0aJ9Jo34hYc7aOKuuMQmST4c=" 101 }, 102 "src/third_party/angle/third_party/glmark2/src": { 103 "url": "https://chromium.googlesource.com/external/github.com/glmark2/glmark2", ··· 131 }, 132 "src/third_party/dawn": { 133 "url": "https://dawn.googlesource.com/dawn.git", 134 + "rev": "1fde167ae683982d77b9ca7e1308bf9f498291e8", 135 + "hash": "sha256-PbDTKSU19jn2hLDoazceYB/Rd6/qu6npPSrjOdeXFuU=" 136 }, 137 "src/third_party/dawn/third_party/glfw": { 138 "url": "https://chromium.googlesource.com/external/github.com/glfw/glfw", ··· 246 }, 247 "src/third_party/devtools-frontend/src": { 248 "url": "https://chromium.googlesource.com/devtools/devtools-frontend", 249 + "rev": "4cca0aa00c4915947f1081014d5cfa2e83d357fa", 250 + "hash": "sha256-pVNr8NB5U/Uf688oOvPLpu81isCn/WmjJky01A000a4=" 251 }, 252 "src/third_party/dom_distiller_js/dist": { 253 "url": "https://chromium.googlesource.com/chromium/dom-distiller/dist.git", ··· 796 }, 797 "src/v8": { 798 "url": "https://chromium.googlesource.com/v8/v8.git", 799 + "rev": "de9d0f8b56ae61896e4d2ac577fc589efb14f87d", 800 + "hash": "sha256-/T5fisjmN80bs3PtQrCRfH3Bo9dRSd3f+xpPLDh1RTY=" 801 } 802 } 803 },
+6 -3
pkgs/by-name/ar/archipelago/package.nix
··· 7 }: 8 let 9 pname = "archipelago"; 10 - version = "0.6.1"; 11 src = fetchurl { 12 url = "https://github.com/ArchipelagoMW/Archipelago/releases/download/${version}/Archipelago_${version}_linux-x86_64.AppImage"; 13 - hash = "sha256-8mPlR5xVnHL9I0rV4bMFaffSJv7dMlCcPHrLkM/pyVU="; 14 }; 15 16 appimageContents = appimageTools.extractType2 { inherit pname version src; }; ··· 40 changelog = "https://github.com/ArchipelagoMW/Archipelago/releases/tag/${version}"; 41 license = lib.licenses.mit; 42 mainProgram = "archipelago"; 43 - maintainers = with lib.maintainers; [ pyrox0 ]; 44 platforms = lib.platforms.linux; 45 }; 46 }
··· 7 }: 8 let 9 pname = "archipelago"; 10 + version = "0.6.2"; 11 src = fetchurl { 12 url = "https://github.com/ArchipelagoMW/Archipelago/releases/download/${version}/Archipelago_${version}_linux-x86_64.AppImage"; 13 + hash = "sha256-DdlfHb8iTCfTGGBUYQeELYh2NF/2GcamtuJzeYb2A5M="; 14 }; 15 16 appimageContents = appimageTools.extractType2 { inherit pname version src; }; ··· 40 changelog = "https://github.com/ArchipelagoMW/Archipelago/releases/tag/${version}"; 41 license = lib.licenses.mit; 42 mainProgram = "archipelago"; 43 + maintainers = with lib.maintainers; [ 44 + pyrox0 45 + iqubic 46 + ]; 47 platforms = lib.platforms.linux; 48 }; 49 }
+3 -3
pkgs/by-name/im/immich-public-proxy/package.nix
··· 8 }: 9 buildNpmPackage rec { 10 pname = "immich-public-proxy"; 11 - version = "1.11.3"; 12 src = fetchFromGitHub { 13 owner = "alangrainger"; 14 repo = "immich-public-proxy"; 15 tag = "v${version}"; 16 - hash = "sha256-rroccsVgPsBOTQ/2Mb+BoqOm59LdjqSqKsL40n7NXss="; 17 }; 18 19 sourceRoot = "${src.name}/app"; 20 21 - npmDepsHash = "sha256-9zuw24lPFsDWHrplShsCQDrUpBa6U+NeRVJNSI4OJHA="; 22 23 # patch in absolute nix store paths so the process doesn't need to cwd in $out 24 postPatch = ''
··· 8 }: 9 buildNpmPackage rec { 10 pname = "immich-public-proxy"; 11 + version = "1.11.5"; 12 src = fetchFromGitHub { 13 owner = "alangrainger"; 14 repo = "immich-public-proxy"; 15 tag = "v${version}"; 16 + hash = "sha256-jSAQbACWEt/gyZbr4sOM17t3KZoxPOM0RZFbsLZfcRM="; 17 }; 18 19 sourceRoot = "${src.name}/app"; 20 21 + npmDepsHash = "sha256-av+XKzrTl+8xizYFZwCTmaLNsbBnusf03I1Uvkp0sF8="; 22 23 # patch in absolute nix store paths so the process doesn't need to cwd in $out 24 postPatch = ''
+3 -3
pkgs/by-name/in/inputplumber/package.nix
··· 10 11 rustPlatform.buildRustPackage rec { 12 pname = "inputplumber"; 13 - version = "0.59.2"; 14 15 src = fetchFromGitHub { 16 owner = "ShadowBlip"; 17 repo = "InputPlumber"; 18 tag = "v${version}"; 19 - hash = "sha256-IAopZnGU0NOfpViLLetAm5BycTXyYL1fJ5WJW8qVnwA="; 20 }; 21 22 useFetchCargoVendor = true; 23 - cargoHash = "sha256-m/U9fYio39hkjcVDO3VlK5yJF9nWL9Y5B8D0FgD7LKk="; 24 25 nativeBuildInputs = [ 26 pkg-config
··· 10 11 rustPlatform.buildRustPackage rec { 12 pname = "inputplumber"; 13 + version = "0.60.2"; 14 15 src = fetchFromGitHub { 16 owner = "ShadowBlip"; 17 repo = "InputPlumber"; 18 tag = "v${version}"; 19 + hash = "sha256-zcy9scs7oRRLKm/FL6BfO64IstWY4HmTRxG/jJG0jLw="; 20 }; 21 22 useFetchCargoVendor = true; 23 + cargoHash = "sha256-fw7pM6HSy/8fNTYu7MqKiTl/2jdyDOLDBNhd0rpzb6M="; 24 25 nativeBuildInputs = [ 26 pkg-config
+3 -3
pkgs/by-name/li/libdeltachat/package.nix
··· 20 21 stdenv.mkDerivation rec { 22 pname = "libdeltachat"; 23 - version = "1.160.0"; 24 25 src = fetchFromGitHub { 26 owner = "chatmail"; 27 repo = "core"; 28 tag = "v${version}"; 29 - hash = "sha256-F88mDic6cnSa8mHhr+uX2WORFgJOu9LChLIS6DqWc40="; 30 }; 31 32 patches = [ ··· 36 cargoDeps = rustPlatform.fetchCargoVendor { 37 pname = "deltachat-core-rust"; 38 inherit version src; 39 - hash = "sha256-pZwCcAOYLKR6wfncIyuisYccNSGK+lqUg6lkyfKPgFk="; 40 }; 41 42 nativeBuildInputs =
··· 20 21 stdenv.mkDerivation rec { 22 pname = "libdeltachat"; 23 + version = "2.2.0"; 24 25 src = fetchFromGitHub { 26 owner = "chatmail"; 27 repo = "core"; 28 tag = "v${version}"; 29 + hash = "sha256-Evk2g2fqEmo/cd6+Sd76U0Byj6OEm99OZuUkoxTELbM="; 30 }; 31 32 patches = [ ··· 36 cargoDeps = rustPlatform.fetchCargoVendor { 37 pname = "deltachat-core-rust"; 38 inherit version src; 39 + hash = "sha256-vnnROLmsAh6mSPuQzTSbYSgxGfrKaanuLcADFE+kQeM="; 40 }; 41 42 nativeBuildInputs =
+2 -2
pkgs/by-name/pl/plasma-plugin-blurredwallpaper/package.nix
··· 6 }: 7 stdenvNoCC.mkDerivation (finalAttrs: { 8 pname = "plasma-plugin-blurredwallpaper"; 9 - version = "3.2.1"; 10 11 src = fetchFromGitHub { 12 owner = "bouteillerAlan"; 13 repo = "blurredwallpaper"; 14 rev = "v${finalAttrs.version}"; 15 - hash = "sha256-P/N7g/cl2K0R4NKebfqZnr9WQkHPSvHNbKbWiOxs76k="; 16 }; 17 18 installPhase = ''
··· 6 }: 7 stdenvNoCC.mkDerivation (finalAttrs: { 8 pname = "plasma-plugin-blurredwallpaper"; 9 + version = "3.3.1"; 10 11 src = fetchFromGitHub { 12 owner = "bouteillerAlan"; 13 repo = "blurredwallpaper"; 14 rev = "v${finalAttrs.version}"; 15 + hash = "sha256-hXuJhSS5QEgKWn60ctF3N+avfez8Ktrne3re/FY/VMU="; 16 }; 17 18 installPhase = ''
+2 -2
pkgs/by-name/ro/rockcraft/package.nix
··· 10 11 python3Packages.buildPythonApplication rec { 12 pname = "rockcraft"; 13 - version = "1.12.0"; 14 15 src = fetchFromGitHub { 16 owner = "canonical"; 17 repo = "rockcraft"; 18 rev = version; 19 - hash = "sha256-yv+TGDSUBKJf5X+73Do9KrAcCodeBPqpIHgpYZslR3o="; 20 }; 21 22 pyproject = true;
··· 10 11 python3Packages.buildPythonApplication rec { 12 pname = "rockcraft"; 13 + version = "1.13.0"; 14 15 src = fetchFromGitHub { 16 owner = "canonical"; 17 repo = "rockcraft"; 18 rev = version; 19 + hash = "sha256-pIOCgOC969Fj3lNnmsb6QTEV8z1KWxrUSsdl6Aogd4Q="; 20 }; 21 22 pyproject = true;
+6 -6
pkgs/by-name/sh/shopify-cli/manifests/package-lock.json
··· 1 { 2 "name": "shopify", 3 - "version": "3.82.0", 4 "lockfileVersion": 3, 5 "requires": true, 6 "packages": { 7 "": { 8 "name": "shopify", 9 - "version": "3.82.0", 10 "dependencies": { 11 - "@shopify/cli": "3.82.0" 12 }, 13 "bin": { 14 "shopify": "node_modules/@shopify/cli/bin/run.js" ··· 579 } 580 }, 581 "node_modules/@shopify/cli": { 582 - "version": "3.82.0", 583 - "resolved": "https://registry.npmjs.org/@shopify/cli/-/cli-3.82.0.tgz", 584 - "integrity": "sha512-y+Sq21Zr+vJVQu7z2wNKXXI4NnkACuh/Tt/KrAX7C+NntmKLXl7CZEaVesmJ5shpksG2up1iY1MgMYsDPoNpUA==", 585 "license": "MIT", 586 "os": [ 587 "darwin",
··· 1 { 2 "name": "shopify", 3 + "version": "3.82.1", 4 "lockfileVersion": 3, 5 "requires": true, 6 "packages": { 7 "": { 8 "name": "shopify", 9 + "version": "3.82.1", 10 "dependencies": { 11 + "@shopify/cli": "3.82.1" 12 }, 13 "bin": { 14 "shopify": "node_modules/@shopify/cli/bin/run.js" ··· 579 } 580 }, 581 "node_modules/@shopify/cli": { 582 + "version": "3.82.1", 583 + "resolved": "https://registry.npmjs.org/@shopify/cli/-/cli-3.82.1.tgz", 584 + "integrity": "sha512-iIABwasf+aMSBIjaPsKlVSaLp3vcOIPcfiitdoMUJKQhjIVbq8KdwaAa/MLUMe5B+l230zjq/xGB8U3JeJY0eg==", 585 "license": "MIT", 586 "os": [ 587 "darwin",
+2 -2
pkgs/by-name/sh/shopify-cli/manifests/package.json
··· 1 { 2 "name": "shopify", 3 - "version": "3.82.0", 4 "private": true, 5 "bin": { 6 "shopify": "node_modules/@shopify/cli/bin/run.js" 7 }, 8 "dependencies": { 9 - "@shopify/cli": "3.82.0" 10 } 11 }
··· 1 { 2 "name": "shopify", 3 + "version": "3.82.1", 4 "private": true, 5 "bin": { 6 "shopify": "node_modules/@shopify/cli/bin/run.js" 7 }, 8 "dependencies": { 9 + "@shopify/cli": "3.82.1" 10 } 11 }
+2 -2
pkgs/by-name/sh/shopify-cli/package.nix
··· 5 shopify-cli, 6 }: 7 let 8 - version = "3.82.0"; 9 in 10 buildNpmPackage { 11 pname = "shopify"; ··· 13 14 src = ./manifests; 15 16 - npmDepsHash = "sha256-liqEE0AXbj9L23xR6cpNK6b7CdL2pWvFFjL2S1lwKwQ="; 17 dontNpmBuild = true; 18 19 passthru = {
··· 5 shopify-cli, 6 }: 7 let 8 + version = "3.82.1"; 9 in 10 buildNpmPackage { 11 pname = "shopify"; ··· 13 14 src = ./manifests; 15 16 + npmDepsHash = "sha256-s0wlJxA3DUXRGBlLvyesLr9H/nbDc9yHBBWBLjQd8vE="; 17 dontNpmBuild = true; 18 19 passthru = {
+2 -2
pkgs/development/compilers/openjdk/generic.nix
··· 427 428 buildFlags = if atLeast17 then [ "images" ] else [ "all" ]; 429 430 - separateDebugInfo = true; 431 - __structuredAttrs = true; 432 433 # -j flag is explicitly rejected by the build system: 434 # Error: 'make -jN' is not supported, use 'make JOBS=N'
··· 427 428 buildFlags = if atLeast17 then [ "images" ] else [ "all" ]; 429 430 + separateDebugInfo = atLeast11; 431 + __structuredAttrs = atLeast11; 432 433 # -j flag is explicitly rejected by the build system: 434 # Error: 'make -jN' is not supported, use 'make JOBS=N'
+2
pkgs/development/python-modules/aider-chat/default.nix
··· 264 disabledTestPaths = [ 265 # Tests require network access 266 "tests/scrape/test_scrape.py" 267 # Expected 'mock' to have been called once 268 "tests/help/test_help.py" 269 ]; ··· 273 # Tests require network 274 "test_urls" 275 "test_get_commit_message_with_custom_prompt" 276 # FileNotFoundError 277 "test_get_commit_message" 278 # Expected 'launch_gui' to have been called once
··· 264 disabledTestPaths = [ 265 # Tests require network access 266 "tests/scrape/test_scrape.py" 267 + "tests/basic/test_repomap.py" 268 # Expected 'mock' to have been called once 269 "tests/help/test_help.py" 270 ]; ··· 274 # Tests require network 275 "test_urls" 276 "test_get_commit_message_with_custom_prompt" 277 + "test_cmd_tokens_output" 278 # FileNotFoundError 279 "test_get_commit_message" 280 # Expected 'launch_gui' to have been called once
+2 -2
pkgs/development/python-modules/django-types/default.nix
··· 8 9 buildPythonPackage rec { 10 pname = "django-types"; 11 - version = "0.20.0"; 12 pyproject = true; 13 14 src = fetchPypi { 15 pname = "django_types"; 16 inherit version; 17 - hash = "sha256-TlXSxWFV49addd756x2VqJEwPyrBn8z2/oBW2kKT+uc="; 18 }; 19 20 build-system = [ poetry-core ];
··· 8 9 buildPythonPackage rec { 10 pname = "django-types"; 11 + version = "0.22.0"; 12 pyproject = true; 13 14 src = fetchPypi { 15 pname = "django_types"; 16 inherit version; 17 + hash = "sha256-TOzJ7uhG5/8qOYvsnf5lQ+du+5IqeljF1gZLyw5qPcU="; 18 }; 19 20 build-system = [ poetry-core ];
+2 -2
pkgs/development/python-modules/edk2-pytool-library/default.nix
··· 17 18 buildPythonPackage rec { 19 pname = "edk2-pytool-library"; 20 - version = "0.23.4"; 21 pyproject = true; 22 23 disabled = pythonOlder "3.10"; ··· 26 owner = "tianocore"; 27 repo = "edk2-pytool-library"; 28 tag = "v${version}"; 29 - hash = "sha256-7wjNOwrTXhieUI7Etn6AdiNedoPKSIADtC7dc4Bdlxc="; 30 }; 31 32 build-system = [
··· 17 18 buildPythonPackage rec { 19 pname = "edk2-pytool-library"; 20 + version = "0.23.6"; 21 pyproject = true; 22 23 disabled = pythonOlder "3.10"; ··· 26 owner = "tianocore"; 27 repo = "edk2-pytool-library"; 28 tag = "v${version}"; 29 + hash = "sha256-62uWRr1n3C51OeHeCKKJrB1KLcjRGnwBCgpC0RPWum8="; 30 }; 31 32 build-system = [
+89
pkgs/development/python-modules/langchain-google-genai/default.nix
···
··· 1 + { 2 + lib, 3 + buildPythonPackage, 4 + fetchFromGitHub, 5 + 6 + # build-system 7 + poetry-core, 8 + 9 + # dependencies 10 + filetype, 11 + google-api-core, 12 + google-auth, 13 + google-generativeai, 14 + langchain-core, 15 + pydantic, 16 + 17 + # tests 18 + freezegun, 19 + langchain-tests, 20 + numpy, 21 + pytest-asyncio, 22 + pytest-mock, 23 + pytestCheckHook, 24 + syrupy, 25 + 26 + # passthru 27 + gitUpdater, 28 + }: 29 + 30 + buildPythonPackage rec { 31 + pname = "langchain-google-genai"; 32 + version = "2.1.5"; 33 + pyproject = true; 34 + 35 + src = fetchFromGitHub { 36 + owner = "langchain-ai"; 37 + repo = "langchain-google"; 38 + tag = "libs/genai/v${version}"; 39 + hash = "sha256-NCy4PHUSChsMVSebshDRGsg/koY7S4+mvI+GlIqW4q4="; 40 + }; 41 + 42 + sourceRoot = "${src.name}/libs/genai"; 43 + 44 + build-system = [ poetry-core ]; 45 + 46 + pythonRelaxDeps = [ 47 + # Each component release requests the exact latest core. 48 + # That prevents us from updating individual components. 49 + "langchain-core" 50 + ]; 51 + 52 + dependencies = [ 53 + filetype 54 + google-api-core 55 + google-auth 56 + google-generativeai 57 + langchain-core 58 + pydantic 59 + ]; 60 + 61 + nativeCheckInputs = [ 62 + freezegun 63 + langchain-tests 64 + numpy 65 + pytest-asyncio 66 + pytest-mock 67 + pytestCheckHook 68 + syrupy 69 + ]; 70 + 71 + pytestFlagsArray = [ "tests/unit_tests" ]; 72 + 73 + pythonImportsCheck = [ "langchain_google_genai" ]; 74 + 75 + passthru.updateScript = gitUpdater { 76 + rev-prefix = "libs/genai/v"; 77 + }; 78 + 79 + meta = { 80 + changelog = "https://github.com/langchain-ai/langchain-google/releases/tag/${src.tag}"; 81 + description = "LangChain integrations for Google Gemini"; 82 + homepage = "https://github.com/langchain-ai/langchain-google/tree/main/libs/genai"; 83 + license = lib.licenses.mit; 84 + maintainers = [ 85 + lib.maintainers.eu90h 86 + lib.maintainers.sarahec 87 + ]; 88 + }; 89 + }
+2
pkgs/development/python-modules/posthog/default.nix
··· 15 requests, 16 setuptools, 17 six, 18 }: 19 20 buildPythonPackage rec { ··· 38 python-dateutil 39 requests 40 six 41 ]; 42 43 nativeCheckInputs = [
··· 15 requests, 16 setuptools, 17 six, 18 + typing-extensions, 19 }: 20 21 buildPythonPackage rec { ··· 39 python-dateutil 40 requests 41 six 42 + typing-extensions 43 ]; 44 45 nativeCheckInputs = [
+31 -3
pkgs/development/python-modules/rclone-python/default.nix
··· 2 lib, 3 buildPythonPackage, 4 fetchFromGitHub, 5 setuptools, 6 rich, 7 rclone, 8 }: 9 10 buildPythonPackage rec { ··· 18 tag = "v${version}"; 19 hash = "sha256-vvsiXS3uI0TcL+X8+75BQmycrF+EGIgQE1dmGef35rI="; 20 }; 21 22 build-system = [ setuptools ]; 23 24 dependencies = [ 25 - rclone 26 rich 27 ]; 28 29 - # tests require working internet connection 30 - doCheck = false; 31 32 pythonImportsCheck = [ "rclone_python" ]; 33
··· 2 lib, 3 buildPythonPackage, 4 fetchFromGitHub, 5 + pytestCheckHook, 6 + replaceVars, 7 setuptools, 8 rich, 9 rclone, 10 + writableTmpDirAsHomeHook, 11 }: 12 13 buildPythonPackage rec { ··· 21 tag = "v${version}"; 22 hash = "sha256-vvsiXS3uI0TcL+X8+75BQmycrF+EGIgQE1dmGef35rI="; 23 }; 24 + 25 + patches = [ 26 + (replaceVars ./hardcode-rclone-path.patch { 27 + rclone = lib.getExe rclone; 28 + }) 29 + ]; 30 31 build-system = [ setuptools ]; 32 33 dependencies = [ 34 rich 35 ]; 36 37 + nativeCheckInputs = [ 38 + pytestCheckHook 39 + writableTmpDirAsHomeHook 40 + ]; 41 + 42 + preCheck = '' 43 + # Unlike upstream we don't actually run an S3 server for testing. 44 + # See https://github.com/Johannes11833/rclone_python/blob/master/launch_test_server.sh 45 + mkdir -p "$HOME/.config/rclone" 46 + cat > "$HOME/.config/rclone/rclone.conf" <<EOF 47 + [test_server_s3] 48 + type = combine 49 + upstreams = "testdir=$(mktemp -d)" 50 + EOF 51 + ''; 52 + 53 + disabledTestPaths = [ 54 + # test requires a remote that supports public links 55 + "tests/test_link.py" 56 + # test looks up latest version on rclone.org 57 + "tests/test_version.py" 58 + ]; 59 60 pythonImportsCheck = [ "rclone_python" ]; 61
+129
pkgs/development/python-modules/rclone-python/hardcode-rclone-path.patch
diff --git a/rclone_python/rclone.py b/rclone_python/rclone.py
index da399b4..e05365a 100644
--- a/rclone_python/rclone.py
+++ b/rclone_python/rclone.py
@@ -43,7 +43,7 @@ def is_installed() -> bool:
     """
     :return: True if rclone is correctly installed on the system.
    """
-    return which("rclone") is not None
+    return True
 
 
 @__check_installed
@@ -199,7 +199,7 @@ def copy(
         in_path,
         out_path,
         ignore_existing=ignore_existing,
-        command="rclone copy",
+        command="@rclone@ copy",
         command_descr="Copying",
         show_progress=show_progress,
         listener=listener,
@@ -234,7 +234,7 @@ def copyto(
         in_path,
         out_path,
         ignore_existing=ignore_existing,
-        command="rclone copyto",
+        command="@rclone@ copyto",
         command_descr="Copying",
         show_progress=show_progress,
         listener=listener,
@@ -269,7 +269,7 @@ def move(
         in_path,
         out_path,
         ignore_existing=ignore_existing,
-        command="rclone move",
+        command="@rclone@ move",
         command_descr="Moving",
         show_progress=show_progress,
         listener=listener,
@@ -304,7 +304,7 @@ def moveto(
         in_path,
         out_path,
         ignore_existing=ignore_existing,
-        command="rclone moveto",
+        command="@rclone@ moveto",
         command_descr="Moving",
         show_progress=show_progress,
         listener=listener,
@@ -336,7 +336,7 @@ def sync(
     _rclone_transfer_operation(
         src_path,
         dest_path,
-        command="rclone sync",
+        command="@rclone@ sync",
         command_descr="Syncing",
         show_progress=show_progress,
         listener=listener,
diff --git a/rclone_python/scripts/get_version.py b/rclone_python/scripts/get_version.py
index b1d30fd..bc00cad 100644
--- a/rclone_python/scripts/get_version.py
+++ b/rclone_python/scripts/get_version.py
@@ -2,6 +2,6 @@ from subprocess import check_output
 
 
 def get_version():
-    stdout = check_output("rclone version", shell=True, encoding="utf8")
+    stdout = check_output("@rclone@ version", shell=True, encoding="utf8")
 
     return stdout.split("\n")[0].replace("rclone ", "")
diff --git a/rclone_python/scripts/update_hash_types.py b/rclone_python/scripts/update_hash_types.py
index 92fbd0a..ef963cf 100644
--- a/rclone_python/scripts/update_hash_types.py
+++ b/rclone_python/scripts/update_hash_types.py
@@ -14,7 +14,7 @@ def update_hashes(output_path: str):
     """
 
     # get all supported backends
-    rclone_output = sp.check_output("rclone hashsum", shell=True, encoding="utf8")
+    rclone_output = sp.check_output("@rclone@ hashsum", shell=True, encoding="utf8")
     lines = rclone_output.splitlines()
 
     hashes = []
diff --git a/rclone_python/utils.py b/rclone_python/utils.py
index d4a8413..1b29bd8 100644
--- a/rclone_python/utils.py
+++ b/rclone_python/utils.py
@@ -66,9 +66,9 @@ def run_rclone_cmd(
     # otherwise the default rclone config path is used:
     config = Config()
     if config.config_path is not None:
-        base_command = f"rclone --config={config.config_path}"
+        base_command = f"@rclone@ --config={config.config_path}"
     else:
-        base_command = "rclone"
+        base_command = "@rclone@"
 
     # add optional arguments and flags to the command
     args_str = args2string(args)
diff --git a/tests/test_copy.py b/tests/test_copy.py
index 4ded5fa..1cae53b 100644
--- a/tests/test_copy.py
+++ b/tests/test_copy.py
@@ -45,11 +45,11 @@ def create_local_file(
 @pytest.mark.parametrize(
     "wrapper_command,rclone_command",
     [
-        (rclone.copy, "rclone copy"),
-        (rclone.copyto, "rclone copyto"),
-        (rclone.sync, "rclone sync"),
-        (rclone.move, "rclone move"),
-        (rclone.moveto, "rclone moveto"),
+        (rclone.copy, "@rclone@ copy"),
+        (rclone.copyto, "@rclone@ copyto"),
+        (rclone.sync, "@rclone@ sync"),
+        (rclone.move, "@rclone@ move"),
+        (rclone.moveto, "@rclone@ moveto"),
     ],
 )
 def test_rclone_command_called(wrapper_command: Callable, rclone_command: str):
@@ -62,7 +62,7 @@ def test_rclone_command_called(wrapper_command: Callable, rclone_command: str):
         rclone.utils.subprocess,
         "Popen",
         return_value=subprocess.Popen(
-            "rclone help", stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True
+            "@rclone@ help", stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True
         ),
     ) as mock:
         wrapper_command("nothing/not_a.file", "fake_remote:unicorn/folder")
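The patch above replaces every literal `rclone` invocation with an `@rclone@` placeholder; the nixpkgs convention is to fill such `@name@` placeholders with an absolute store path at build time (e.g. via a `substituteAll`-style fixup). A minimal sketch of that substitution step in Python — the store path shown is hypothetical, and `substitute_placeholders` is an illustrative helper, not a real nixpkgs function:

```python
import re

def substitute_placeholders(text: str, values: dict) -> str:
    """Replace @name@ placeholders with concrete values, leaving
    any placeholder without a known value untouched."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        # Unknown placeholders pass through unchanged rather than guessed.
        return values.get(name, match.group(0))
    return re.sub(r"@([A-Za-z_][A-Za-z0-9_]*)@", repl, text)

# Hypothetical store path; a real build would use rclone's actual output path.
patched = substitute_placeholders(
    'base_command = "@rclone@ --config=conf"',
    {"rclone": "/nix/store/abc123-rclone-1.66.0/bin/rclone"},
)
print(patched)
# base_command = "/nix/store/abc123-rclone-1.66.0/bin/rclone --config=conf"
```

This is why the patch also changes `is_installed` to `return True`: once the placeholder is rewritten to a store path, the wrapped binary is guaranteed to exist, so probing `PATH` with `which` would wrongly report it missing.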
+2
pkgs/top-level/python-packages.nix
@@ -7807,6 +7807,8 @@
   langchain-fireworks = callPackage ../development/python-modules/langchain-fireworks { };
 
+  langchain-google-genai = callPackage ../development/python-modules/langchain-google-genai { };
+
   langchain-groq = callPackage ../development/python-modules/langchain-groq { };
 
   langchain-huggingface = callPackage ../development/python-modules/langchain-huggingface { };