...
The `buildPythonPackage` mainly does four things:

-* In the `buildPhase`, it calls `${python.interpreter} setup.py bdist_wheel` to
  build a wheel binary zipfile.
* In the `installPhase`, it installs the wheel file using `pip install *.whl`.
* In the `postFixup` phase, the `wrapPythonPrograms` bash function is called to
...
As a workaround, install it as an extra `preInstall` step:

```shell
-${python.interpreter} setup.py install_data --install-dir=$out --root=$out
sed -i '/ = data\_files/d' setup.py
```
...
Updating packages in bulk leads to lots of breakages, which is why a
stabilization period on the `python-unstable` branch is required.

Once the branch is sufficiently stable it should normally be merged
into the `staging` branch.
...
The `buildPythonPackage` mainly does four things:

+* In the `buildPhase`, it calls `${python.pythonForBuild.interpreter} setup.py bdist_wheel` to
  build a wheel binary zipfile.
* In the `installPhase`, it installs the wheel file using `pip install *.whl`.
* In the `postFixup` phase, the `wrapPythonPrograms` bash function is called to
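A minimal `buildPythonPackage` call exercising these phases might look like the following sketch (the package name, version, and hash are placeholders, not a real package):

```nix
{ buildPythonPackage, fetchPypi }:

buildPythonPackage rec {
  pname = "example";  # hypothetical package
  version = "1.0.0";

  src = fetchPypi {
    inherit pname version;
    hash = "sha256-0000000000000000000000000000000000000000000=";  # placeholder
  };

  # buildPhase builds a wheel via `setup.py bdist_wheel`, installPhase
  # pip-installs it, and postFixup wraps the installed programs.
}
```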
...
As a workaround, install it as an extra `preInstall` step:

```shell
+${python.pythonForBuild.interpreter} setup.py install_data --install-dir=$out --root=$out
sed -i '/ = data\_files/d' setup.py
```
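In a package expression this workaround would typically be wired up as a `preInstall` hook, along these lines (a sketch; the surrounding derivation is assumed):

```nix
preInstall = ''
  ${python.pythonForBuild.interpreter} setup.py install_data --install-dir=$out --root=$out
  sed -i '/ = data\_files/d' setup.py
'';
```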
...
Updating packages in bulk leads to lots of breakages, which is why a
stabilization period on the `python-unstable` branch is required.

+If a package is fragile and often breaks during these bulk updates, it
+may be reasonable to set `passthru.skipBulkUpdate = true` in the
+derivation. This decision should not be made on a whim and should
+always be supported by a qualifying comment.

Once the branch is sufficiently stable it should normally be merged
into the `staging` branch.
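In a derivation, opting out of bulk updates looks like this (hypothetical package; note the required justifying comment):

```nix
buildPythonPackage rec {
  pname = "fragile-example";  # hypothetical
  version = "1.2.3";
  # ...
  # Fragile: upstream pins exact dependency versions and regularly breaks
  # when its dependencies are bumped in bulk.
  passthru.skipBulkUpdate = true;
}
```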
...
- The [services.wordpress.sites.&lt;name&gt;.plugins](#opt-services.wordpress.sites._name_.plugins) and [services.wordpress.sites.&lt;name&gt;.themes](#opt-services.wordpress.sites._name_.themes) options have been converted from sets to attribute sets to allow for consumers to specify explicit install paths via attribute name.

- Nebula now runs as a system user and group created for each nebula network, using the `CAP_NET_ADMIN` ambient capability on launch rather than starting as root. Ensure that any files each Nebula instance needs to access are owned by the correct user and group, by default `nebula-${networkName}`.

- In `mastodon` it is now necessary to specify the location of the file containing the `PostgreSQL` database password. The default value of the `services.mastodon.database.passwordFile` parameter has been changed from `/var/lib/mastodon/secrets/db-password` to `null`.
...
- The `unifi-poller` package and corresponding NixOS module have been renamed to `unpoller` to match upstream.

- The new option `services.tailscale.useRoutingFeatures` controls various settings for using Tailscale features like exit nodes and subnet routers. If you wish to use your machine as an exit node, set this option to `server`; if you instead wish to use an exit node, set it to `client`. The strict RPF warning has been removed, as RPF will be loosened automatically based on the value of this setting.

- `openjdk` from version 11 and above is no longer built with `openjfx` (i.e. JavaFX) support by default. You can re-enable it by overriding, e.g.: `openjdk11.override { enableJavaFX = true; };`.
...
- The option `services.nomad.extraSettingsPlugins` has been fixed to allow more than one plugin in the path.

- The option `services.prometheus.exporters.pihole.interval` does not exist anymore and has been removed.
...
- The [services.wordpress.sites.&lt;name&gt;.plugins](#opt-services.wordpress.sites._name_.plugins) and [services.wordpress.sites.&lt;name&gt;.themes](#opt-services.wordpress.sites._name_.themes) options have been converted from sets to attribute sets to allow for consumers to specify explicit install paths via attribute name.

+- The `protonmail-bridge` package has been updated to v3.0, and the CLI executable is now named `bridge` instead of `protonmail-bridge` to be more in line with upstream.

- Nebula now runs as a system user and group created for each nebula network, using the `CAP_NET_ADMIN` ambient capability on launch rather than starting as root. Ensure that any files each Nebula instance needs to access are owned by the correct user and group, by default `nebula-${networkName}`.

- In `mastodon` it is now necessary to specify the location of the file containing the `PostgreSQL` database password. The default value of the `services.mastodon.database.passwordFile` parameter has been changed from `/var/lib/mastodon/secrets/db-password` to `null`.
...
- The `unifi-poller` package and corresponding NixOS module have been renamed to `unpoller` to match upstream.

+- The `protonmail-bridge` package has been updated to v3.0, and the CLI executable is now named `bridge` instead of `protonmail-bridge` to be more in line with upstream.

- The new option `services.tailscale.useRoutingFeatures` controls various settings for using Tailscale features like exit nodes and subnet routers. If you wish to use your machine as an exit node, set this option to `server`; if you instead wish to use an exit node, set it to `client`. The strict RPF warning has been removed, as RPF will be loosened automatically based on the value of this setting.

- `openjdk` from version 11 and above is no longer built with `openjfx` (i.e. JavaFX) support by default. You can re-enable it by overriding, e.g.: `openjdk11.override { enableJavaFX = true; };`.
...
- The option `services.nomad.extraSettingsPlugins` has been fixed to allow more than one plugin in the path.

- The option `services.prometheus.exporters.pihole.interval` does not exist anymore and has been removed.

+- `k3s` can now be configured with an EnvironmentFile for its systemd service, allowing secrets to be provided without ending up in the Nix Store.
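Assuming the corresponding module option is named `services.k3s.environmentFile` (an assumption; check the module documentation), usage might look like:

```nix
{
  services.k3s.enable = true;
  # Path outside the Nix store; the file can hold e.g. K3S_TOKEN=<secret>
  services.k3s.environmentFile = "/run/secrets/k3s.env";
}
```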
...
}:
rustPlatform.buildRustPackage rec {
  pname = "polkadot";
-  version = "0.9.37";

  src = fetchFromGitHub {
    owner = "paritytech";
    repo = "polkadot";
    rev = "v${version}";
-    hash = "sha256-/mgJNjliPUmMkhT/1oiX9+BJHfY3SMsKfFv9HCyWRQQ=";

    # the build process of polkadot requires a .git folder in order to determine
    # the git commit hash that is being built and add it to the version string.
...
    '';
  };

-  cargoHash = "sha256-o+APFYKgA3zjQSGrkpnyf5LEBBqvZtcfWlzCk6nL02A=";

  buildInputs = lib.optionals stdenv.isDarwin [ Security SystemConfiguration ];

...
}:
rustPlatform.buildRustPackage rec {
  pname = "polkadot";
+  version = "0.9.38";

  src = fetchFromGitHub {
    owner = "paritytech";
    repo = "polkadot";
    rev = "v${version}";
+    hash = "sha256-qS9LZ9KBjOw7hEkUzu7eZFj6ZwbkCDxoqA7FPXb13o4=";

    # the build process of polkadot requires a .git folder in order to determine
    # the git commit hash that is being built and add it to the version string.
...
    '';
  };

+  cargoHash = "sha256-4BOgG/NzSppTeEtoEVxqlYjV4FGkNFMeF+qCJwPz+7o=";

  buildInputs = lib.optionals stdenv.isDarwin [ Security SystemConfiguration ];

...
    maintainers = with maintainers; [ artturin ];
    # NOTE: CopyQ supports windows and osx, but I cannot test these.
    platforms = platforms.linux;
  };
}
...
    maintainers = with maintainers; [ artturin ];
    # NOTE: CopyQ supports windows and osx, but I cannot test these.
    platforms = platforms.linux;
+    mainProgram = "copyq";
  };
}
...
    # The fs-repo-migrations code itself is the same between
    # the two versions but the migration code, which is built
    # into separate binaries, is not.
-    rev = "fs-repo-11-to-12/v1.0.2";
-    sha256 = "sha256-CG4utwH+/+Igw+SP3imhl39wijlB53UGtkJG5Mwh+Ik=";
  };

  sourceRoot = "source/fs-repo-migrations";

-  vendorSha256 = "sha256-/DqkBBtR/nU8gk3TFqNKY5zQU6BFMc3N8Ti+38mi/jk=";

  doCheck = false;

...
    # The fs-repo-migrations code itself is the same between
    # the two versions but the migration code, which is built
    # into separate binaries, is not.
+    rev = "fs-repo-12-to-13/v1.0.0";
+    hash = "sha256-QQone7E2Be+jVfnrwqQ1Ny4jo6mSDHhaY3ErkNdn2f8=";
  };

  sourceRoot = "source/fs-repo-migrations";

+  vendorHash = "sha256-/DqkBBtR/nU8gk3TFqNKY5zQU6BFMc3N8Ti+38mi/jk=";

  doCheck = false;

...
"""

import argparse
import os
-import pathlib
import re
import requests
from concurrent.futures import ThreadPoolExecutor as Pool
from packaging.version import Version as _Version
from packaging.version import InvalidVersion
from packaging.specifiers import SpecifierSet
import collections
import subprocess

...
PRERELEASES = False

GIT = "git"

-NIXPGKS_ROOT = subprocess.check_output(["git", "rev-parse", "--show-toplevel"]).decode('utf-8').strip()

-import logging
logging.basicConfig(level=logging.INFO)

...
    values = regex.findall(text)
    return values

def _get_unique_value(attribute, text):
    """Match attribute in text and return unique match.
...
    else:
        raise ValueError("no value found for {}".format(attribute))

-def _get_line_and_value(attribute, text):
    """Match attribute in text. Return the line and the value of the attribute."""
-    regex = '({}\s+=\s+"(.*)";)'.format(attribute)
    regex = re.compile(regex)
-    value = regex.findall(text)
-    n = len(value)
    if n > 1:
        raise ValueError("found too many values for {}".format(attribute))
    elif n == 1:
-        return value[0]
    else:
        raise ValueError("no value found for {}".format(attribute))


-def _replace_value(attribute, value, text):
    """Search and replace value of attribute in text."""
-    old_line, old_value = _get_line_and_value(attribute, text)
    new_line = old_line.replace(old_value, value)
    new_text = text.replace(old_line, new_line)
    return new_text
...
        return r.json()
    else:
        raise ValueError("request for {} failed".format(url))


SEMVER = {
...
    attr_path = os.environ.get("UPDATE_NIX_ATTR_PATH", f"python3Packages.{package}")
    try:
        homepage = subprocess.check_output(
-            ["nix", "eval", "-f", f"{NIXPGKS_ROOT}/default.nix", "--raw", f"{attr_path}.src.meta.homepage"])\
            .decode('utf-8')
    except Exception as e:
        raise ValueError(f"Unable to determine homepage: {e}")
...
    release = next(filter(lambda x: strip_prefix(x['tag_name']) == version, releases))
    prefix = get_prefix(release['tag_name'])
-    try:
-        sha256 = subprocess.check_output(["nix-prefetch-url", "--type", "sha256", "--unpack", f"{release['tarball_url']}"], stderr=subprocess.DEVNULL)\
-            .decode('utf-8').strip()
-    except:
-        # this may fail if they have both a branch and a tag of the same name, attempt tag name
-        tag_url = str(release['tarball_url']).replace("tarball","tarball/refs/tags")
-        sha256 = subprocess.check_output(["nix-prefetch-url", "--type", "sha256", "--unpack", tag_url], stderr=subprocess.DEVNULL)\
-            .decode('utf-8').strip()

-    return version, sha256, prefix


FETCHERS = {
...
    if fetcher == 'fetchPypi':
        try:
            src_format = _get_unique_value('format', text)
-        except ValueError as e:
            src_format = None # format was not given

        try:
            extension = _get_unique_value('extension', text)
-        except ValueError as e:
            extension = None # extension was not given

        if extension is None:
...
            raise ValueError('url does not point to PyPI.')

    elif fetcher == 'fetchFromGitHub':
-        if "fetchSubmodules" in text:
-            raise ValueError("fetchFromGitHub fetcher doesn't support submodules")
        extension = "tar.gz"

    return extension
...
    # Attempt a fetch using each pname, e.g. backports-zoneinfo vs backports.zoneinfo
    successful_fetch = False
    for pname in pnames:
        try:
            new_version, new_sha256, prefix = FETCHERS[fetcher](pname, extension, version, target)
            successful_fetch = True
...
        raise ValueError("no file available for {}.".format(pname))

    text = _replace_value('version', new_version, text)
    # hashes from pypi are base16-encoded sha256's, normalize it to sri to avoid merge conflicts
    # sri hashes have been the default format since nix 2.4+
-    sri_hash = subprocess.check_output(["nix", "--extra-experimental-features", "nix-command", "hash", "to-sri", "--type", "sha256", new_sha256]).decode('utf-8').strip()

-    # fetchers can specify a sha256, or a sri hash
-    try:
-        text = _replace_value('sha256', sri_hash, text)
-    except ValueError:
-        text = _replace_value('hash', sri_hash, text)

    if fetcher == 'fetchFromGitHub':
        # in the case of fetchFromGitHub, it's common to see `rev = version;` or `rev = "v${version}";`
...
    target = args.target

    packages = list(map(os.path.abspath, args.package))

    logging.info("Updating packages...")

...
"""

import argparse
+import json
+import logging
import os
import re
import requests
from concurrent.futures import ThreadPoolExecutor as Pool
from packaging.version import Version as _Version
from packaging.version import InvalidVersion
from packaging.specifiers import SpecifierSet
+from typing import Optional, Any
import collections
import subprocess

...
PRERELEASES = False

+BULK_UPDATE = False

GIT = "git"

+NIXPKGS_ROOT = subprocess.check_output(["git", "rev-parse", "--show-toplevel"]).decode('utf-8').strip()

logging.basicConfig(level=logging.INFO)

...
    values = regex.findall(text)
    return values


+def _get_attr_value(attr_path: str) -> Optional[Any]:
+    try:
+        response = subprocess.check_output([
+            "nix",
+            "--extra-experimental-features", "nix-command",
+            "eval",
+            "-f", f"{NIXPKGS_ROOT}/default.nix",
+            "--json",
+            f"{attr_path}"
+        ])
+        return json.loads(response.decode())
+    except (subprocess.CalledProcessError, ValueError):
+        return None


def _get_unique_value(attribute, text):
    """Match attribute in text and return unique match.
...
    else:
        raise ValueError("no value found for {}".format(attribute))

+def _get_line_and_value(attribute, text, value=None):
    """Match attribute in text. Return the line and the value of the attribute."""
+    if value is None:
+        regex = rf'({attribute}\s+=\s+\"(.*)\";)'
+    else:
+        regex = rf'({attribute}\s+=\s+\"({value})\";)'
    regex = re.compile(regex)
+    results = regex.findall(text)
+    n = len(results)
    if n > 1:
        raise ValueError("found too many values for {}".format(attribute))
    elif n == 1:
+        return results[0]
    else:
        raise ValueError("no value found for {}".format(attribute))


+def _replace_value(attribute, value, text, oldvalue=None):
    """Search and replace value of attribute in text."""
+    if oldvalue is None:
+        old_line, old_value = _get_line_and_value(attribute, text)
+    else:
+        old_line, old_value = _get_line_and_value(attribute, text, oldvalue)
    new_line = old_line.replace(old_value, value)
    new_text = text.replace(old_line, new_line)
    return new_text
...
        return r.json()
    else:
        raise ValueError("request for {} failed".format(url))


+def _hash_to_sri(algorithm, value):
+    """Convert a hash to its SRI representation"""
+    return subprocess.check_output([
+        "nix",
+        "hash",
+        "to-sri",
+        "--type", algorithm,
+        value
+    ]).decode().strip()


+def _skip_bulk_update(attr_name: str) -> bool:
+    return bool(_get_attr_value(
+        f"{attr_name}.skipBulkUpdate"
+    ))


SEMVER = {
...
    attr_path = os.environ.get("UPDATE_NIX_ATTR_PATH", f"python3Packages.{package}")
    try:
        homepage = subprocess.check_output(
+            ["nix", "eval", "-f", f"{NIXPKGS_ROOT}/default.nix", "--raw", f"{attr_path}.src.meta.homepage"])\
            .decode('utf-8')
    except Exception as e:
        raise ValueError(f"Unable to determine homepage: {e}")
...
    release = next(filter(lambda x: strip_prefix(x['tag_name']) == version, releases))
    prefix = get_prefix(release['tag_name'])

+    # some attributes require using the fetchgit fetcher
+    git_fetcher_args = []
+    if (_get_attr_value(f"{attr_path}.src.fetchSubmodules")):
+        git_fetcher_args.append("--fetch-submodules")
+    if (_get_attr_value(f"{attr_path}.src.fetchLFS")):
+        git_fetcher_args.append("--fetch-lfs")
+    if (_get_attr_value(f"{attr_path}.src.leaveDotGit")):
+        git_fetcher_args.append("--leave-dotGit")

+    if git_fetcher_args:
+        algorithm = "sha256"
+        cmd = [
+            "nix-prefetch-git",
+            f"https://github.com/{owner}/{repo}.git",
+            "--hash", algorithm,
+            "--rev", f"refs/tags/{release['tag_name']}"
+        ]
+        cmd.extend(git_fetcher_args)
+        response = subprocess.check_output(cmd)
+        document = json.loads(response.decode())
+        hash = _hash_to_sri(algorithm, document[algorithm])
+    else:
+        try:
+            hash = subprocess.check_output([
+                "nix-prefetch-url",
+                "--type", "sha256",
+                "--unpack",
+                f"{release['tarball_url']}"
+            ], stderr=subprocess.DEVNULL).decode('utf-8').strip()
+        except (subprocess.CalledProcessError, UnicodeError):
+            # this may fail if they have both a branch and a tag of the same name, attempt tag name
+            tag_url = str(release['tarball_url']).replace("tarball","tarball/refs/tags")
+            hash = subprocess.check_output([
+                "nix-prefetch-url",
+                "--type", "sha256",
+                "--unpack",
+                tag_url
+            ], stderr=subprocess.DEVNULL).decode('utf-8').strip()

+    return version, hash, prefix


FETCHERS = {
...
    if fetcher == 'fetchPypi':
        try:
            src_format = _get_unique_value('format', text)
+        except ValueError:
            src_format = None # format was not given

        try:
            extension = _get_unique_value('extension', text)
+        except ValueError:
            extension = None # extension was not given

        if extension is None:
...
            raise ValueError('url does not point to PyPI.')

    elif fetcher == 'fetchFromGitHub':
        extension = "tar.gz"

    return extension
...
    # Attempt a fetch using each pname, e.g. backports-zoneinfo vs backports.zoneinfo
    successful_fetch = False
    for pname in pnames:
+        if BULK_UPDATE and _skip_bulk_update(f"python3Packages.{pname}"):
+            raise ValueError(f"Bulk update skipped for {pname}")
        try:
            new_version, new_sha256, prefix = FETCHERS[fetcher](pname, extension, version, target)
            successful_fetch = True
...
        raise ValueError("no file available for {}.".format(pname))

    text = _replace_value('version', new_version, text)

    # hashes from pypi are base16-encoded sha256's, normalize it to sri to avoid merge conflicts
    # sri hashes have been the default format since nix 2.4+
+    sri_hash = _hash_to_sri("sha256", new_sha256)

+    # retrieve the old output hash for a more precise match
+    if old_hash := _get_attr_value(f"python3Packages.{pname}.src.outputHash"):
+        # fetchers can specify a sha256, or a sri hash
+        try:
+            text = _replace_value('hash', sri_hash, text, old_hash)
+        except ValueError:
+            text = _replace_value('sha256', sri_hash, text, old_hash)
+    else:
+        raise ValueError(f"Unable to retrieve old hash for {pname}")

    if fetcher == 'fetchFromGitHub':
        # in the case of fetchFromGitHub, it's common to see `rev = version;` or `rev = "v${version}";`
...
    target = args.target

    packages = list(map(os.path.abspath, args.package))

+    if len(packages) > 1:
+        global BULK_UPDATE
+        BULK_UPDATE = True

    logging.info("Updating packages...")

...
    "--without-ldb-lmdb"
  ];

  stripDebugList = [ "bin" "lib" "modules" ];

  meta = with lib; {
...
    "--without-ldb-lmdb"
  ];

+  # python-config from build Python gives incorrect values when cross-compiling.
+  # If python-config is not found, the build falls back to using the sysconfig
+  # module, which works correctly in all cases.
+  PYTHON_CONFIG = "/invalid";

  stripDebugList = [ "bin" "lib" "modules" ];

  meta = with lib; {
pkgs/development/libraries/libre/default.nix (+1)
...
    ++ lib.optional (stdenv.cc.cc != null) "SYSROOT_ALT=${stdenv.cc.cc}"
    ++ lib.optional (stdenv.cc.libc != null) "SYSROOT=${lib.getDev stdenv.cc.libc}"
  ;
  meta = {
    description = "A library for real-time communications with async IO support and a complete SIP stack";
    homepage = "https://github.com/baresip/re";
...
    ++ lib.optional (stdenv.cc.cc != null) "SYSROOT_ALT=${stdenv.cc.cc}"
    ++ lib.optional (stdenv.cc.libc != null) "SYSROOT=${lib.getDev stdenv.cc.libc}"
  ;
+  enableParallelBuilding = true;
  meta = {
    description = "A library for real-time communications with async IO support and a complete SIP stack";
    homepage = "https://github.com/baresip/re";
pkgs/development/libraries/librem/default.nix (+1)
...
    ++ lib.optional (stdenv.cc.cc != null) "SYSROOT_ALT=${lib.getDev stdenv.cc.cc}"
    ++ lib.optional (stdenv.cc.libc != null) "SYSROOT=${lib.getDev stdenv.cc.libc}"
  ;
  meta = {
    description = "A library for real-time audio and video processing";
    homepage = "https://github.com/baresip/rem";
...
    ++ lib.optional (stdenv.cc.cc != null) "SYSROOT_ALT=${lib.getDev stdenv.cc.cc}"
    ++ lib.optional (stdenv.cc.libc != null) "SYSROOT=${lib.getDev stdenv.cc.libc}"
  ;
+  enableParallelBuilding = true;
  meta = {
    description = "A library for real-time audio and video processing";
    homepage = "https://github.com/baresip/rem";
...
in
buildPythonPackage rec {
  pname = "ml-collections";
-  version = "0.1.0";

  # ml-collections does not have any git release tags. See https://github.com/google/ml_collections/issues/8.
  src = fetchPypi {
    inherit version;
    pname = "ml_collections";
-    sha256 = "0g6gxfz8g6fh1sghys869ylxgpda9hq7ylc8jw05608l3k6pz8ar";
  };

  # The pypi source archive does not include requirements.txt or
...
in
buildPythonPackage rec {
  pname = "ml-collections";
+  version = "0.1.1";

  # ml-collections does not have any git release tags. See https://github.com/google/ml_collections/issues/8.
  src = fetchPypi {
    inherit version;
    pname = "ml_collections";
+    sha256 = "sha256-P+/McuxDOqHl0yMHo+R0u7Z/QFvoFOpSohZr/J2+aMw=";
  };

  # The pypi source archive does not include requirements.txt or
...
      ]
    },
    "notify_push": {
-      "sha256": "1raxkzdcd9mixg30ifv22lzf10j47n79n05yqbf6mjagrgj0rr7f",
-      "url": "https://github.com/nextcloud/notify_push/releases/download/v0.5.0/notify_push.tar.gz",
-      "version": "0.5.0",
      "description": "Push update support for desktop app.\n\nOnce the app is installed, the push binary needs to be setup. You can either use the setup wizard with `occ notify_push:setup` or see the [README](http://github.com/nextcloud/notify_push) for detailed setup instructions",
      "homepage": "",
      "licenses": [
131 },
132 "notify_push": {
133+ "sha256": "1vfa68spnyfivcx0vp49mimf5xg7hsxnifd06imd1c0mw3nlfm4p",
134+ "url": "https://github.com/nextcloud-releases/notify_push/releases/download/v0.6.0/notify_push-v0.6.0.tar.gz",
135+ "version": "0.6.0",
136 "description": "Push update support for desktop app.\n\nOnce the app is installed, the push binary needs to be setup. You can either use the setup wizard with `occ notify_push:setup` or see the [README](http://github.com/nextcloud/notify_push) for detailed setup instructions",
137 "homepage": "",
138 "licenses": [
pkgs/servers/nextcloud/packages/25.json (+3 -3)
...
      ]
    },
    "notify_push": {
-      "sha256": "1raxkzdcd9mixg30ifv22lzf10j47n79n05yqbf6mjagrgj0rr7f",
-      "url": "https://github.com/nextcloud/notify_push/releases/download/v0.5.0/notify_push.tar.gz",
-      "version": "0.5.0",
      "description": "Push update support for desktop app.\n\nOnce the app is installed, the push binary needs to be setup. You can either use the setup wizard with `occ notify_push:setup` or see the [README](http://github.com/nextcloud/notify_push) for detailed setup instructions",
      "homepage": "",
      "licenses": [
...
      ]
    },
    "notify_push": {
+      "sha256": "1vfa68spnyfivcx0vp49mimf5xg7hsxnifd06imd1c0mw3nlfm4p",
+      "url": "https://github.com/nextcloud-releases/notify_push/releases/download/v0.6.0/notify_push-v0.6.0.tar.gz",
+      "version": "0.6.0",
      "description": "Push update support for desktop app.\n\nOnce the app is installed, the push binary needs to be setup. You can either use the setup wizard with `occ notify_push:setup` or see the [README](http://github.com/nextcloud/notify_push) for detailed setup instructions",
      "homepage": "",
      "licenses": [
...
buildGoModule rec {
  pname = "trivy";
-  version = "0.37.2";

  src = fetchFromGitHub {
    owner = "aquasecurity";
    repo = pname;
    rev = "v${version}";
-    sha256 = "sha256-k5S0ttOhI+vjiGJpIPVi9ro6n3f2Cxe7HiADvs14Zuo=";
  };
  # hash mismatch across linux and darwin
  proxyVendor = true;
-  vendorSha256 = "sha256-EJw5DxiBF+gw5X+vqrnZsNCm2umOHEq6GeQ5V/Z0DrE=";

  excludedPackages = "misc";

...
buildGoModule rec {
  pname = "trivy";
+  version = "0.37.3";

  src = fetchFromGitHub {
    owner = "aquasecurity";
    repo = pname;
    rev = "v${version}";
+    sha256 = "sha256-fndA2rApDXwKeQEQ9Vy/9iJBJPcRWt+yJfvRdNDOwZU=";
  };
  # hash mismatch across linux and darwin
  proxyVendor = true;
+  vendorHash = "sha256-91UPIz5HM82d6s8kHEb9w/vLQgXmoV8fIcbRyXDMNL8=";

  excludedPackages = "misc";

pkgs/tools/audio/tts/default.nix (+1 -1)
...
    # cython modules are not installed for some reason
    (
      cd TTS/tts/utils/monotonic_align
-      ${python.interpreter} setup.py install --prefix=$out
    )
  '';

...
    # cython modules are not installed for some reason
    (
      cd TTS/tts/utils/monotonic_align
+      ${python.pythonForBuild.interpreter} setup.py install --prefix=$out
    )
  '';
