Compare commits

...

411 Commits

Author SHA1 Message Date
dependabot[bot]
6f3bc5cc4c
chore(deps): bump golang.org/x/net from 0.40.0 to 0.41.0 in the go-all group (#2632)
chore(deps): bump golang.org/x/net in the go-all group

Bumps the go-all group with 1 update: [golang.org/x/net](https://github.com/golang/net).


Updates `golang.org/x/net` from 0.40.0 to 0.41.0
- [Commits](https://github.com/golang/net/compare/v0.40.0...v0.41.0)

---
updated-dependencies:
- dependency-name: golang.org/x/net
  dependency-version: 0.41.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: go-all
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-11 15:38:14 +02:00
dependabot[bot]
c2caf9c7d7
chore(deps): bump github.com/leonelquinteros/gotext from 1.7.1 to 1.7.2 in the go-all group (#2629)
chore(deps): bump github.com/leonelquinteros/gotext in the go-all group

Bumps the go-all group with 1 update: [github.com/leonelquinteros/gotext](https://github.com/leonelquinteros/gotext).


Updates `github.com/leonelquinteros/gotext` from 1.7.1 to 1.7.2
- [Release notes](https://github.com/leonelquinteros/gotext/releases)
- [Commits](https://github.com/leonelquinteros/gotext/compare/v1.7.1...v1.7.2)

---
updated-dependencies:
- dependency-name: github.com/leonelquinteros/gotext
  dependency-version: 1.7.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: go-all
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-04 23:45:08 +02:00
transifex-integration[bot]
8615239329
Updates for file po/en.po in zh_CN (#2628)
Translate po/en.po in zh_CN

100% translated source file: 'po/en.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-05-28 14:07:22 +02:00
transifex-integration[bot]
c9a8507654
Updates for file po/en.po in ko (#2626)
Translate po/en.po in ko

100% translated source file: 'po/en.po'
on 'ko'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-05-27 14:59:55 +02:00
dependabot[bot]
75e90c3a6d
chore(deps): bump the go-all group with 2 updates (#2622)
Bumps the go-all group with 2 updates: [golang.org/x/net](https://github.com/golang/net) and [golang.org/x/term](https://github.com/golang/term).


Updates `golang.org/x/net` from 0.39.0 to 0.40.0
- [Commits](https://github.com/golang/net/compare/v0.39.0...v0.40.0)

Updates `golang.org/x/term` from 0.31.0 to 0.32.0
- [Commits](https://github.com/golang/term/compare/v0.31.0...v0.32.0)

---
updated-dependencies:
- dependency-name: golang.org/x/net
  dependency-version: 0.40.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: go-all
- dependency-name: golang.org/x/term
  dependency-version: 0.32.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: go-all
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-27 12:15:33 +02:00
Jo
8ab3652846
Fix locale initialization logic (#2619)
* Fix locale initialization logic

* add comment

* Update main.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update GitHub Actions workflow for testing

* Update GitHub Actions workflow to allow artifact overwrite

* Update GitHub Actions workflow to allow artifact overwrite

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-06 18:16:47 +02:00
dependabot[bot]
98be3fec97
chore(deps): bump golang.org/x/sys from 0.32.0 to 0.33.0 in the go-all group (#2620)
chore(deps): bump golang.org/x/sys in the go-all group

Bumps the go-all group with 1 update: [golang.org/x/sys](https://github.com/golang/sys).


Updates `golang.org/x/sys` from 0.32.0 to 0.33.0
- [Commits](https://github.com/golang/sys/compare/v0.32.0...v0.33.0)

---
updated-dependencies:
- dependency-name: golang.org/x/sys
  dependency-version: 0.33.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: go-all
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-06 18:08:17 +02:00
Jo
c6a2226ce1
add locale tooling (#2618) 2025-05-01 22:07:39 +00:00
transifex-integration[bot]
3e82496057
Updates for file po/en.po in hu (#2617)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-05-01 23:08:02 +02:00
Jo
a26ac1ba95
Add SECURITY.md with contact and supported version policy (#2616)
* docs: add SECURITY.md with contact and supported version policy

* fix image render
2025-05-01 08:18:31 +00:00
Jo
b745f87210
chore(ci): update golangci lint v2.1.5 (#2615)
* chore(ci): update golangci-lint to v2.1.5 in ci.Dockerfile

* add golangci

* fix lint

* fix lint pkg/upgrade

* reenable lint
2025-05-01 10:00:10 +02:00
dependabot[bot]
b4a41700ee
chore(deps): bump golang.org/x/net from 0.38.0 to 0.39.0 (#2609)
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.38.0 to 0.39.0.
- [Commits](https://github.com/golang/net/compare/v0.38.0...v0.39.0)

---
updated-dependencies:
- dependency-name: golang.org/x/net
  dependency-version: 0.39.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-01 09:58:03 +02:00
dependabot[bot]
0aa80e521e
chore(deps): bump github.com/stretchr/testify from 1.9.0 to 1.10.0 (#2612)
Bumps [github.com/stretchr/testify](https://github.com/stretchr/testify) from 1.9.0 to 1.10.0.
- [Release notes](https://github.com/stretchr/testify/releases)
- [Commits](https://github.com/stretchr/testify/compare/v1.9.0...v1.10.0)

---
updated-dependencies:
- dependency-name: github.com/stretchr/testify
  dependency-version: 1.10.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-01 09:55:03 +02:00
Jo
bce9f2fc72
Update dependabot.yml (#2614) 2025-05-01 09:54:47 +02:00
dependabot[bot]
5a71db2526
chore(deps): bump github.com/deckarep/golang-set/v2 from 2.7.0 to 2.8.0 (#2611)
Bumps [github.com/deckarep/golang-set/v2](https://github.com/deckarep/golang-set) from 2.7.0 to 2.8.0.
- [Release notes](https://github.com/deckarep/golang-set/releases)
- [Commits](https://github.com/deckarep/golang-set/compare/v2.7.0...v2.8.0)

---
updated-dependencies:
- dependency-name: github.com/deckarep/golang-set/v2
  dependency-version: 2.8.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-01 08:40:49 +02:00
dependabot[bot]
535370bca0
chore(deps): bump golang.org/x/term from 0.30.0 to 0.31.0 (#2613)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.30.0 to 0.31.0.
- [Commits](https://github.com/golang/term/compare/v0.30.0...v0.31.0)

---
updated-dependencies:
- dependency-name: golang.org/x/term
  dependency-version: 0.31.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-01 08:37:38 +02:00
Jo
9a3be07887
Create dependabot.yml (#2608)
* Create dependabot.yml

* Update dependabot.yml

* Update dependabot.yml
2025-04-30 21:49:38 +00:00
Jo
2f205ee96c
chore(ci): update golangci-lint to v2.1.5 in ci.Dockerfile (#2607) 2025-04-30 21:45:29 +00:00
Jo
1982ce0366
chore(devcontainer): add devcontainer for clean builds (#2606)
chore(devcontainer): set up passwordless sudo for docker user for clean development and testing
2025-04-30 23:08:02 +02:00
Jo
2dcf94544c
refactor(build): optimize allocations and add tests (#2601)
* perf(build): optimize map/slice allocations and use strings.Builder

* test(build): add tests for parsePackageList
2025-04-29 22:45:24 +02:00
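A minimal Go sketch of the strings.Builder pattern referenced in the commit above; the function and data here are illustrative placeholders, not yay's actual build code.

```go
package main

import (
	"fmt"
	"strings"
)

// joinPkgDests concatenates package paths with a single allocation via
// strings.Builder instead of repeated string "+" operations.
func joinPkgDests(dests []string) string {
	total := 0
	for _, d := range dests {
		total += len(d) + 1
	}
	var sb strings.Builder
	sb.Grow(total) // pre-size once instead of reallocating per append
	for i, d := range dests {
		if i > 0 {
			sb.WriteByte(' ')
		}
		sb.WriteString(d)
	}
	return sb.String()
}

func main() {
	fmt.Println(joinPkgDests([]string{"/tmp/yay-1.pkg.tar.zst", "/tmp/yay-debug-1.pkg.tar.zst"}))
}
```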
Ferdinand Bachmann
95fc0938fd
fix(installer): Fix the same pkgbase being built multiple times on error (#2561)
fix(installer): Fixes the same pkgbase being built multiple times on error

The previous fix commit ec837c8 failed to address the case where the
build fails, as packages are only added to builtPkgDests on a successful
build. This commit addresses this by adding the package to the map earlier.

Fixes #2560.
2025-04-29 22:45:10 +02:00
transifex-integration[bot]
ff176c0dd2
Updates for file po/en.po in hu (#2605)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-04-28 10:17:56 +02:00
Jo
bf315041b1
refactor(upgrade): optimize code and improve maintainability (#2600)
* refactor(upgrade): reduce code duplication in Print methods and fix typo in Filter documentation

* refactor(upgrade): optimize UserExcludeUpgrades with early return and simplify GraphUpgrades function
2025-04-24 12:06:27 +02:00
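A hypothetical sketch of the early-return shape mentioned in the second bullet above; the real UserExcludeUpgrades in yay uses different types, so this only illustrates skipping the filtering pass when there is nothing to exclude.

```go
package main

import "fmt"

// excludeUpgrades keeps only upgrades the user did not exclude; the early
// return avoids a full filtering pass when the exclusion set is empty.
func excludeUpgrades(excluded map[string]bool, upgrades []string) []string {
	if len(excluded) == 0 {
		return upgrades
	}
	kept := make([]string, 0, len(upgrades))
	for _, name := range upgrades {
		if !excluded[name] {
			kept = append(kept, name)
		}
	}
	return kept
}

func main() {
	fmt.Println(excludeUpgrades(map[string]bool{"linux": true}, []string{"linux", "yay", "pacman"}))
}
```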
dependabot[bot]
559bc06b31
chore(deps): bump golang.org/x/net from 0.37.0 to 0.38.0 (#2598)
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.37.0 to 0.38.0.
- [Commits](https://github.com/golang/net/compare/v0.37.0...v0.38.0)

---
updated-dependencies:
- dependency-name: golang.org/x/net
  dependency-version: 0.38.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-17 11:41:44 +02:00
transifex-integration[bot]
35019f95b6
Updates for file po/en.po in nl (#2592)
Translate po/en.po in nl

100% translated source file: 'po/en.po'
on 'nl'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-03-20 15:11:04 +01:00
transifex-integration[bot]
50cbf70bf4
Updates for file po/en.po in zh_TW (#2591)
Translate po/en.po in zh_TW

100% translated source file: 'po/en.po'
on 'zh_TW'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-03-20 15:10:57 +01:00
Jo
0b5f5f0ccd
fix(main): return 1 exit code on panic (#2590) 2025-03-09 22:59:50 +01:00
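A generic Go sketch of how a non-zero exit code can be forced on panic (a deferred recover in main); this shows the common pattern, not necessarily the exact change made in #2590.

```go
package main

import (
	"fmt"
	"os"
)

func run() error {
	// Program logic lives here and may panic deep in the call stack.
	return nil
}

func main() {
	defer func() {
		if r := recover(); r != nil {
			fmt.Fprintln(os.Stderr, "panic:", r)
			os.Exit(1) // report failure to the calling shell
		}
	}()
	if err := run(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```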
Jo
a300330b94
fix(ci): prefer ghcr image (#2589)
prefer ghcr image
2025-03-09 22:50:58 +01:00
Jo
670598912e
fix: correct Docker manifest creation in builder image workflow (#2588)
* fix: correct Docker manifest creation in builder image workflow

- Fix "invalid reference format" error by properly separating Docker Hub and
  GitHub Container Registry tags
- Use short SHA format instead of long format for more manageable tags
- Improve manifest list creation process to handle multiple registries correctly
- Ensure proper handling of ghcr.io prefix for GitHub Container Registry

* update golangci and comment out community
2025-03-09 22:20:50 +01:00
Jo
257b230e39
chore(yay): update deps (#2587)
update deps
2025-03-09 20:45:10 +01:00
Jo
d2c67ae0a4
fix(yay): fix minor performance and safety issue (#2581) 2025-02-25 17:15:34 +01:00
Jo
4432c60246 fix(ci): fix issue with builder image CI 2025-02-20 17:16:35 +00:00
Jo
33ba07fe0d
fix(yay): update go mod (#2580)
update go mod
2025-02-17 15:54:21 +01:00
Dominik Link
d37e365ac3
fixed issue #2471 where it wouldn't show info when packages are in a group (#2576)
fixed issue #2471 where it wouldn't show info when packages are in a group and modified test
2025-02-17 15:28:35 +01:00
transifex-integration[bot]
6807ecc081
Updates for file po/en.po in zh_TW (#2571)
Translate po/en.po in zh_TW

100% translated source file: 'po/en.po'
on 'zh_TW'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-02-14 14:36:32 +01:00
transifex-integration[bot]
b07d8c1447
Updates for file po/en.po in de (#2572)
Translate po/en.po in de

100% translated source file: 'po/en.po'
on 'de'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-02-14 14:36:13 +01:00
transifex-integration[bot]
3da808847f
Updates for file po/en.po in ru_RU (#2574)
Translate po/en.po in ru_RU

100% translated source file: 'po/en.po'
on 'ru_RU'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-02-05 12:01:03 +01:00
transifex-integration[bot]
590a3d3a8c
Updates for file po/en.po in ru (#2575)
Translate po/en.po in ru

100% translated source file: 'po/en.po'
on 'ru'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-02-05 12:00:33 +01:00
Ikko Eltociear Ashimine
69685d0fb5
docs: update README.md (#2564)
minor fix
2025-01-23 14:42:52 +01:00
transifex-integration[bot]
5572d1817e
Updates for file po/en.po in zh_TW (#2569)
Translate po/en.po in zh_TW

100% translated source file: 'po/en.po'
on 'zh_TW'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-01-15 11:11:08 +01:00
transifex-integration[bot]
2f403a4f28
Updates for file po/en.po in zh_CN (#2568)
Translate po/en.po in zh_CN

100% translated source file: 'po/en.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-01-15 11:09:49 +01:00
transifex-integration[bot]
9322197d0c
Updates for file po/en.po in ca_ES (#2556)
Translate po/en.po in ca_ES

100% translated source file: 'po/en.po'
on 'ca_ES'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-01-03 16:06:02 +01:00
transifex-integration[bot]
46f3842e6f
Updates for file po/en.po in ca (#2555)
Translate po/en.po in ca

100% translated source file: 'po/en.po'
on 'ca'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-01-03 14:55:04 +01:00
transifex-integration[bot]
76000ae987
Updates for file po/en.po in fr (#2557)
Translate po/en.po in fr

100% translated source file: 'po/en.po'
on 'fr'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-01-03 14:54:45 +01:00
transifex-integration[bot]
e27979d21d
Updates for file po/en.po in fr_FR (#2558)
Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2025-01-03 14:54:16 +01:00
transifex-integration[bot]
c0baacd633
Updates for file po/en.po in sv (#2559)
Translate po/en.po in sv

100% translated source file: 'po/en.po'
on 'sv'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-30 10:57:45 +00:00
transifex-integration[bot]
43567b5d85
Updates for file po/en.po in it_IT (#2548)
Translate po/en.po in it_IT

100% translated source file: 'po/en.po'
on 'it_IT'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-23 14:22:27 +00:00
transifex-integration[bot]
e18cc87307
Updates for file po/en.po in hu (#2549)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-23 14:22:06 +00:00
transifex-integration[bot]
81a2a19101
Updates for file po/en.po in id (#2550)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on 'id'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-22 15:28:42 +00:00
transifex-integration[bot]
669d7af6d1
Updates for file po/en.po in nl (#2551)
Translate po/en.po in nl

100% translated source file: 'po/en.po'
on 'nl'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-22 15:27:33 +00:00
transifex-integration[bot]
65ce4b9f6f
Updates for file po/en.po in pt (#2553)
Translate po/en.po in pt

100% translated source file: 'po/en.po'
on 'pt'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-22 15:27:04 +00:00
transifex-integration[bot]
5f2b94ce7c
Updates for file po/en.po in pt_BR (#2552)
Translate po/en.po in pt_BR

100% translated source file: 'po/en.po'
on 'pt_BR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-22 15:26:31 +00:00
dmitrodem
3f2f6eae31
socks5 support (#2543)
* socks5 support

socks5 support via environment variable, e.g. SOCKS5_PROXY=localhost:1080 yay ...

* use default transport and update tests to work on arm
2024-12-20 16:49:05 +00:00
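A minimal sketch of one way a SOCKS5_PROXY-style variable can be honored in Go using golang.org/x/net/proxy; per the second bullet above, yay ultimately relied on the default transport, so treat this only as an illustration of the idea.

```go
package main

import (
	"fmt"
	"net/http"
	"os"

	"golang.org/x/net/proxy"
)

// newHTTPClient builds an *http.Client that tunnels through the SOCKS5 proxy
// named in SOCKS5_PROXY (e.g. "localhost:1080"), or returns the default
// client when the variable is unset.
func newHTTPClient() (*http.Client, error) {
	addr := os.Getenv("SOCKS5_PROXY")
	if addr == "" {
		return http.DefaultClient, nil
	}
	dialer, err := proxy.SOCKS5("tcp", addr, nil, proxy.Direct)
	if err != nil {
		return nil, err
	}
	return &http.Client{Transport: &http.Transport{Dial: dialer.Dial}}, nil
}

func main() {
	client, err := newHTTPClient()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	resp, err := client.Get("https://aur.archlinux.org")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}
```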
scarf
da53d3855f
fix: use positional argument for "providers available" translation (#2537) 2024-12-20 16:43:28 +00:00
Matthias Kurz
f23fe98a66
Fix comments: AUR_SEEN is the correct name (#2535)
Fix comments: AUR_SEEN is the correct name
2024-11-21 12:00:11 +01:00
Ferdinand Bachmann
ec837c831d
fix(installer): Fixes the same pkgbase being built multiple times (#2534)
* fix(installer): Fixes the same pkgbase being built multiple times

When building a PKGBUILD pkgbase with multiple pkgnames,
installAURPackages() invokes buildPkg() multiple times for the same
pkgbase. This causes prepare() to be run multiple times for the same
pkgbase, since detection of already built packages happens after
prepare().

Additionally, detection of already built packages can fail if the split
debug packages are enabled and the package does not contain any
binaries, causing no -debug package to be created by makepkg even though
it is listed by makepkg --packagelist.

This commit fixes this by keeping track of the pkgdests built by
buildPkg() and avoiding rebuilds of the same pkgbase in the same yay
invocation.

Fixes #2340.

Signed-off-by: Ferdinand Bachmann <ferdinand.bachmann@yrlf.at>

* fix(installer): Fixes buildPkg() isTarget param being order-dependent

Previously, the buildPkg invocation for a pkgbase only considered
whether the current pkgname is part of installer.origTargets. This made
the decision whether to rebuild the package order-dependent.

This commit fixes this by keeping track of which pkgbases are part of
installer.origTargets and rebuilding the pkgbase if any of its pkgnames
is part of origTargets.

* fix(tests): Test that installing split packages avoids rebuilds

The previous two commits changed how split packages (packages with the same
pkgbase) are built, ensuring that those packages aren't built multiple
times.

This commit updates the lists of commands that the tests expect to be
run so that `makepkg` isn't run multiple times per pkgbase.

---------

Signed-off-by: Ferdinand Bachmann <ferdinand.bachmann@yrlf.at>
2024-11-19 11:08:28 +01:00
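A simplified Go sketch of the bookkeeping described in this commit and refined in the follow-up #2561 above: remember each pkgbase in a builtPkgDests-style map before and after building, so split packages and failed builds never trigger a second makepkg run in the same invocation. Names and signatures are illustrative, not yay's actual installer code.

```go
package main

import "fmt"

// buildPkg stands in for the real makepkg invocation; it returns the package
// files (pkgdests) produced for one pkgbase.
func buildPkg(pkgbase string) ([]string, error) {
	return []string{"/tmp/" + pkgbase + "-12.0.5-1-x86_64.pkg.tar.zst"}, nil
}

// installAURPackages builds each pkgbase at most once per invocation by
// remembering which bases have already been handled, successful or not.
func installAURPackages(targets map[string]string) map[string][]string {
	builtPkgDests := make(map[string][]string) // pkgbase -> built package files
	for pkgname, pkgbase := range targets {
		if _, seen := builtPkgDests[pkgbase]; seen {
			continue // split package (or already-failed base): skip the rebuild
		}
		// Record the pkgbase before building so even a failed build is not
		// retried for every remaining pkgname of the same base.
		builtPkgDests[pkgbase] = nil
		dests, err := buildPkg(pkgbase)
		if err != nil {
			fmt.Printf("skipping %s (wanted for %s): %v\n", pkgbase, pkgname, err)
			continue
		}
		builtPkgDests[pkgbase] = dests
	}
	return builtPkgDests
}

func main() {
	dests := installAURPackages(map[string]string{
		"yay":       "yay",
		"yay-debug": "yay", // same pkgbase: must not trigger a second makepkg run
	})
	fmt.Println(dests["yay"])
}
```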
transifex-integration[bot]
f100c1d54b
Updates for file po/en.po in uk (#2533)
Translate po/en.po in uk

100% translated source file: 'po/en.po'
on 'uk'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-11-15 14:52:12 +01:00
April Hall
3f5d26c4f9
Replace "⚠️ {warning}" with Github Warning Blocks (#2531)
Update README.md
2024-11-12 20:03:03 +00:00
Alice Jacka
3003f1667c
Add missing sudo to binary install instructions (#2525) 2024-10-25 11:12:49 +02:00
Anmol W
fb168fb176
fix swapped fish completion descriptions for --repo and --aur (#2523)
fix swapped fish completions for --repo and --aur
2024-10-25 10:35:07 +02:00
transifex-integration[bot]
842067256b
Updates for file po/en.po in id (#2522)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on 'id'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-10-25 10:30:32 +02:00
transifex-integration[bot]
a6a6dc0acb
Updates for file po/en.po in hu (#2521)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-10-25 10:30:21 +02:00
transifex-integration[bot]
2e06552211
Updates for file po/en.po in zh_CN (#2524)
Translate po/en.po in zh_CN

100% translated source file: 'po/en.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-10-25 10:30:11 +02:00
transifex-integration[bot]
138c2dd6cd
Updates for file po/en.po in sk (#2516)
Translate po/en.po in sk

100% translated source file: 'po/en.po'
on 'sk'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-30 09:32:30 +02:00
transifex-integration[bot]
4872b8b829
Updates for file po/en.po in pl_PL (#2513)
Translate po/en.po in pl_PL

100% translated source file: 'po/en.po'
on 'pl_PL'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-27 10:33:58 +02:00
transifex-integration[bot]
1b6ad7b305
Updates for file po/en.po in hu (#2512)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-27 10:33:51 +02:00
transifex-integration[bot]
d6e961af70
Updates for file po/en.po in pl (#2514)
Translate po/en.po in pl

100% translated source file: 'po/en.po'
on 'pl'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-27 10:33:41 +02:00
Jo
89b32ee9ce
fix(parser): allow disable-sandbox option. fixes #2509 (#2511) 2024-09-24 08:12:41 +00:00
transifex-integration[bot]
f68a57129f
Updates for file po/en.po in hu (#2510)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-24 09:57:49 +02:00
Absobel
bea53a4a09
fix: added missing spaces (#2386) 2024-09-19 14:15:44 +02:00
Rodrigo Barrios
13df7e79eb
Document -Ycc in manpage (#2482)
Signed-off-by: r.b4rr10s <rodrigoedu11@gmail.com>
2024-09-19 14:14:16 +02:00
Tim Konick
0f496c9af9
Fixes syntax error #2506 (#2507)
* Fixes syntax error

* Use escape-chars instead
2024-09-18 07:46:51 +00:00
Jo
84d8f1b7b3
chore: update default po (#2505)
update default po
2024-09-17 16:50:02 +00:00
Jo
3a118b7690
chore: fix failing build actions (#2503) 2024-09-17 17:14:26 +02:00
transifex-integration[bot]
a32f5e7e2c
Updates for file po/en.po in nl (#2504)
Translate po/en.po in nl

100% translated source file: 'po/en.po'
on 'nl'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-17 14:01:14 +02:00
Oliver Tzeng
3c881d577f
fix(zh_TW.po): fixed a lot of simplified chinese (#2498)
- Changelog
 * Added myself as the initial translator, got removed for some
   reason (#1776)
 * Converted simplified to traditional Chinese
 * Formatting
2024-09-14 17:52:30 +02:00
transifex-integration[bot]
86f5c08ec4
Updates for file po/en.po in he (#2491)
Translate po/en.po in he

100% translated source file: 'po/en.po'
on 'he'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-14 17:20:59 +02:00
Stephan Burns
675f0ba3f3
Add sudo to install instructions so it can be run without user switching (#2483)
* Add sudo to install instructions so it can be run without user switching

* Add sudo to one liner and a note for people using different root tools
2024-09-02 15:21:57 +02:00
transifex-integration[bot]
d7d67e3fd3
Updates for file po/en.po in hu (#2490)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-09-02 15:21:04 +02:00
transifex-integration[bot]
c28be1d8b0
Updates for file po/en.po in ru_RU (#2481)
Translate po/en.po in ru_RU

100% translated source file: 'po/en.po'
on 'ru_RU'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-08-07 09:56:31 +02:00
iTrooz
1b8f823f7c
Refactor part of UserExcludeUpgrades() (#2450) 2024-07-29 09:15:20 +02:00
Daniel Oh
836fc5922a
add short option for --repo (#2380)
* add short option for --repo

* run pre-commit

* update man page

* add fish completion

* add a N options

* add long options

---------

Co-authored-by: jguer <me@jguer.space>
2024-06-28 16:40:35 +02:00
Joey Holtzman
0165486bf4
Respect provided targets when using -Si flag (#2460) 2024-06-28 16:39:49 +02:00
Jo
5149e3714d
fix(query): match empty pacman -Si with AUR info (#2459) 2024-06-21 10:06:02 +02:00
Marcus B Spencer
9ed9b0b4e1
Update README.md (#2458)
Update the README to prevent partial upgrades.

Partial upgrades are dangerous on Arch Linux.
Partial upgrades may occur when a `-y` operation is given without a corresponding `-u` operation.

Quote from the ArchWiki:
> Do not use:
> `pacman -Sy package`
> `pacman -Sy` followed by `pacman -S package` (Note the absence of `-Su` when installing the package.)
> `pacman -Syuw` (Note that `pacman -Syuw` implies the same risks as `pacman -Sy`, as it will update the pacman sync database without installing the newer packages.)

https://wiki.archlinux.org/title/System_maintenance#Partial_upgrades_are_unsupported
2024-06-20 14:15:14 +02:00
transifex-integration[bot]
e19700234f
Updates for file po/en.po in he (#2447)
Translate po/en.po in he

100% translated source file: 'po/en.po'
on 'he'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-06-03 09:56:18 +02:00
transifex-integration[bot]
965f8956e9
Updates for file po/en.po in he_IL (#2448)
Translate po/en.po in he_IL

100% translated source file: 'po/en.po'
on 'he_IL'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-06-03 09:38:39 +02:00
transifex-integration[bot]
53c9d0ef34
Updates for file po/en.po in es (#2441)
Translate po/en.po in es

100% translated source file: 'po/en.po'
on 'es'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-05-13 09:23:23 +02:00
Adivardhan Maheshwari
9b41f136d6
Add a package update command (#2205)
* Add a package update command

When running these commands on an archlinux-base Proxmox LXC, they return a 404 error unless `pacman -Sy` is run first.

Although this seems simplistic, it may not be obvious to someone using yay and Arch Linux for the first time.

* Update README.md
2024-05-02 11:00:29 +00:00
Tom
d956dd7888
Fix incorrect formatting of some msgstr of the spanish translation (#2437)
fix: Fix incorrect formatting for "%s to upgrade/install." and "%s will also be installed for this operation." in Spanish translation
2024-05-02 12:58:59 +02:00
eveneast
61dd708a4a
chore: fix function name in comment (#2430)
Signed-off-by: eveneast <qcqs@foxmail.com>
2024-05-02 10:54:16 +00:00
transifex-integration[bot]
ff3ad18fa8
Updates for file po/en.po in fr (#2425)
Translate po/en.po in fr

100% translated source file: 'po/en.po'
on 'fr'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-05-02 12:53:37 +02:00
transifex-integration[bot]
803f708106
Updates for file po/en.po in fr_FR (#2424)
Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-05-02 12:53:29 +02:00
transifex-integration[bot]
9b6d40d7f9
Updates for file po/en.po in hu (#2431)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-05-02 12:53:20 +02:00
transifex-integration[bot]
779a9f16bd
Updates for file po/en.po in fr_FR (#2423)
* Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-04-03 14:32:00 +02:00
transifex-integration[bot]
02d3e2e1c0
Updates for file po/en.po in ru_RU (#2422)
Translate po/en.po in ru_RU

100% translated source file: 'po/en.po'
on 'ru_RU'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-04-03 14:31:51 +02:00
transifex-integration[bot]
8de397ed11
Updates for file po/en.po in fr (#2419)
Translate po/en.po in fr

100% translated source file: 'po/en.po'
on 'fr'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-04-03 10:37:48 +02:00
transifex-integration[bot]
de7ad4070f
Updates for file po/en.po in zh_TW (#2420)
Translate po/en.po in zh_TW

100% translated source file: 'po/en.po'
on 'zh_TW'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-04-03 10:37:41 +02:00
transifex-integration[bot]
3b18e2197c
Updates for file po/en.po in ca_ES (#2421)
Translate po/en.po in ca_ES

100% translated source file: 'po/en.po'
on 'ca_ES'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-04-03 10:37:33 +02:00
transifex-integration[bot]
127b3a5b1a
Updates for file po/en.po in fr_FR (#2414)
* Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-03-31 16:53:36 +02:00
Jo
2ff794da32
chore: update go-alpm (#2409)
* update go alpm

* update golangci

* add missing ldflag
2024-03-23 23:46:02 +01:00
James Raspass
6c2330528f
Use Go 1.21's min/max built-ins (#2405)
This simplifies the code compared to either rolling our own or awkwardly
using math's float functions with integers.
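A minimal standalone sketch of the change described above (illustrative code, not yay's actual call sites), comparing the pre-1.21 float round-trip with the new built-ins:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	a, b := 3, 7

	// Before Go 1.21: no integer min/max, so either roll your own helper or
	// round-trip through float64 to use the math package.
	oldMax := int(math.Max(float64(a), float64(b)))

	// Go 1.21+: min and max are built-ins that work directly on integers.
	newMax := max(a, b)

	fmt.Println(oldMax, newMax, min(a, b)) // 7 7 3
}
```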
2024-03-23 23:27:30 +01:00
transifex-integration[bot]
a1d530cbf4
Updates for file po/en.po in hu (#2401)
* Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-03-17 14:14:46 +01:00
James Raspass
5d887cbd41
Replace github.com/pkg/errors with stdlib errors/fmt (#2400) 2024-03-16 09:10:49 +01:00
Jo
05b76852bd
chore(yay): fix missing command (#2399)
fix missing command
2024-03-16 01:22:48 +01:00
Jo
9524cbbaed
chore: ensure pacman update in builder (#2398)
* ensure pacman up

* lint step fix
2024-03-16 01:13:30 +01:00
Jo
48d1d3d2d5
chore(yay): update dependencies and builder (#2396)
* fix libalpm14 build

* update deps

* try alternative arch
2024-03-16 00:55:05 +01:00
transifex-integration[bot]
741d83c1f0
Updates for file po/en.po in fr_FR (#2388)
Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-03-15 23:30:13 +01:00
transifex-integration[bot]
9c02af429a
Updates for file po/en.po in ru (#2389)
* Translate po/en.po in ru

100% translated source file: 'po/en.po'
on 'ru'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-03-11 15:27:27 +01:00
Jo
03e89d660f
chore: fix min go version (#2383)
fix min go version
2024-03-02 16:40:21 +01:00
transifex-integration[bot]
f7f2169992
Updates for file po/en.po in nl (#2377)
* Translate po/en.po in nl

100% translated source file: 'po/en.po'
on 'nl'.

* Translate po/en.po in nl

100% translated source file: 'po/en.po'
on 'nl'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-02-23 08:28:24 +01:00
Jo
e8080f87c2
chore(yay): fix breaking -git ci test (#2373)
* chore(yay): fix breaking test

* chore(yay): fix breaking test

* chore(yay): fix breaking test

* update gomod

* remove debug commands
2024-02-19 17:04:06 +01:00
Jo
26aa171b2b
fix(query): remove -debug packages from missing list if base package is installed (#2372)
* chore(yay): fix pre-commit

* chore(yay): fix git ignore
2024-02-19 10:29:47 +00:00
transifex-integration[bot]
92d7cb0faa
Updates for file po/en.po in id (#2366)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on 'id'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-02-02 09:41:56 +01:00
transifex-integration[bot]
f3a4fc8987
Updates for file po/en.po in cs (#2367)
Translate po/en.po in cs

100% translated source file: 'po/en.po'
on 'cs'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-02-02 09:41:48 +01:00
transifex-integration[bot]
aa6cad75a3
Updates for file po/en.po in ca (#2353)
Translate po/en.po in ca

100% translated source file: 'po/en.po'
on 'ca'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:44:40 +01:00
transifex-integration[bot]
2078bc936f
Updates for file po/en.po in es (#2354)
Translate po/en.po in es

100% translated source file: 'po/en.po'
on 'es'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:44:23 +01:00
transifex-integration[bot]
d778be4f9f
Updates for file po/en.po in de (#2352)
Translate po/en.po in de

100% translated source file: 'po/en.po'
on 'de'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:44:06 +01:00
transifex-integration[bot]
aeafe23027
Updates for file po/en.po in hu (#2355)
Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:43:51 +01:00
transifex-integration[bot]
6c31477ccd
Updates for file po/en.po in pt_BR (#2357)
Translate po/en.po in pt_BR

100% translated source file: 'po/en.po'
on 'pt_BR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:43:39 +01:00
transifex-integration[bot]
0a930c9ffc
Updates for file po/en.po in id (#2358)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on 'id'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:43:15 +01:00
transifex-integration[bot]
d411524481
Updates for file po/en.po in pt (#2356)
Translate po/en.po in pt

100% translated source file: 'po/en.po'
on 'pt'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:42:51 +01:00
transifex-integration[bot]
15ef062bb5
Updates for file po/en.po in zh_CN (#2359)
Translate po/en.po in zh_CN

100% translated source file: 'po/en.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:42:28 +01:00
transifex-integration[bot]
c2d7d99e43
Updates for file po/en.po in it_IT (#2360)
Translate po/en.po in it_IT

100% translated source file: 'po/en.po'
on 'it_IT'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:42:20 +01:00
transifex-integration[bot]
86207fce64
Updates for file po/en.po in sv (#2361)
Translate po/en.po in sv

100% translated source file: 'po/en.po'
on 'sv'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-25 16:42:00 +01:00
Jo
dc68b1a8fa
chore: update locale files (#2351)
update po
2024-01-25 15:16:21 +00:00
Joey Holtzman
d02c45e5b6
Remove deprecated flags in favor of boolean flags (#2350) 2024-01-25 16:03:47 +01:00
Joey Holtzman
8d773aa6a3
Remove deprecated warning when using -yc together (#2347) 2024-01-24 16:11:51 +01:00
transifex-integration[bot]
b81d34d5cc
Updates for file po/en.po in zh_CN (#2341)
Translate po/en.po in zh_CN

100% translated source file: 'po/en.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-18 15:10:57 +01:00
transifex-integration[bot]
d77dd77141
Updates for file po/en.po in hu (#2338)
* Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-01-04 17:51:26 +01:00
transifex-integration[bot]
e34bce003d
Updates for file po/en.po in hu (#2337)
* Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

* Translate po/en.po in hu

100% translated source file: 'po/en.po'
on 'hu'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-12-27 16:59:38 +01:00
Eng Zer Jun
d23e8925fa
fix(dep_graph): fix panic on selecting AUR providers (#2333)
This commit fixes https://github.com/Jguer/yay/issues/2289 by making
`provideMenu` return the first option when it receives an error input
from the user (e.g. the user sends EOF).

Signed-off-by: Eng Zer Jun <engzerjun@gmail.com>
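A minimal sketch of that fallback behaviour, assuming a non-empty provider list read from stdin (illustrative names, not yay's actual provideMenu):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strconv"
	"strings"
)

func provideMenu(providers []string) string {
	fmt.Printf("Choose a provider (1-%d): ", len(providers))

	line, err := bufio.NewReader(os.Stdin).ReadString('\n')
	if err != nil {
		// Error input (e.g. EOF): fall back to the first option instead of panicking.
		return providers[0]
	}

	n, err := strconv.Atoi(strings.TrimSpace(line))
	if err != nil || n < 1 || n > len(providers) {
		return providers[0]
	}
	return providers[n-1]
}

func main() {
	fmt.Println("selected:", provideMenu([]string{"jre-openjdk", "jre11-openjdk"}))
}
```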
2023-12-11 17:45:54 +01:00
Jo
643830fccd
chore: Use custom image (#2328)
* self-host image

* update deps
2023-11-30 09:39:56 +01:00
transifex-integration[bot]
cb4cd7b451
Updates for file po/en.po in sv (#2321)
Translate po/en.po in sv

100% translated source file: 'po/en.po'
on 'sv'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-29 11:17:30 +01:00
transifex-integration[bot]
350ff1c70a
Updates for file po/en.po in id (#2323)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on 'id'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-29 11:17:16 +01:00
transifex-integration[bot]
26dc74ed67
Updates for file po/en.po in ca (#2320)
Translate po/en.po in ca

100% translated source file: 'po/en.po'
on 'ca'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-29 11:16:57 +01:00
transifex-integration[bot]
a00ff6b3cc
Updates for file po/en.po in de (#2324)
Translate po/en.po in de

100% translated source file: 'po/en.po'
on 'de'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-29 11:16:43 +01:00
transifex-integration[bot]
9e665a98b9
Updates for file po/en.po in pt_BR (#2325)
* Translate po/en.po in pt_BR

100% translated source file: 'po/en.po'
on 'pt_BR'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-29 11:16:30 +01:00
transifex-integration[bot]
4169f0ee42
Updates for file po/en.po in es (#2326)
* Translate po/en.po in es

100% translated source file: 'po/en.po'
on 'es'.

* Translate po/en.po in es

100% translated source file: 'po/en.po'
on 'es'.

* Translate po/en.po in es

100% translated source file: 'po/en.po'
on 'es'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-29 11:16:15 +01:00
transifex-integration[bot]
5a6b18fe23
Updates for file po/en.po in pt (#2322)
* Translate po/en.po in pt

100% translated source file: 'po/en.po'
on 'pt'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-23 16:34:20 +01:00
transifex-integration[bot]
c8577bb613
Updates for file po/en.po in pt_BR (#2317)
Translate po/en.po in pt_BR

100% translated source file: 'po/en.po'
on 'pt_BR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-21 14:10:18 +01:00
transifex-integration[bot]
965f41b938
Updates for file po/en.po in it_IT (#2316)
Translate po/en.po in it_IT

100% translated source file: 'po/en.po'
on 'it_IT'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-21 14:10:06 +01:00
Kirill Motkov
0771ded99b
Refactor AURPKGBUILDRepos and Fix Localization Command in Makefile (#2313)
* optimize mutex usage and logging in AURPKGBUILDRepos function

* fix localization script

Fix the localization script in the Makefile by adding a missing semicolon and the --no-translator flag
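A rough sketch of the mutex pattern from the first point above (illustrative only, not the actual AURPKGBUILDRepos code): fetch results concurrently and hold the lock only for the shared map write, keeping the critical section small.

```go
package main

import (
	"fmt"
	"sync"
)

func fetchRepoURL(pkg string) string {
	// placeholder for the real network lookup
	return "https://aur.archlinux.org/" + pkg + ".git"
}

func pkgbuildRepos(pkgs []string) map[string]string {
	var (
		mu    sync.Mutex
		wg    sync.WaitGroup
		repos = make(map[string]string, len(pkgs))
	)
	for _, pkg := range pkgs {
		wg.Add(1)
		go func(pkg string) {
			defer wg.Done()
			url := fetchRepoURL(pkg) // do the slow work outside the lock
			mu.Lock()
			repos[pkg] = url
			mu.Unlock()
		}(pkg)
	}
	wg.Wait()
	return repos
}

func main() {
	fmt.Println(pkgbuildRepos([]string{"yay", "paru"}))
}
```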
2023-11-20 16:58:20 +01:00
transifex-integration[bot]
8f98ab3d4b
Updates for file po/en.po in de (#2312)
Translate po/en.po in de

100% translated source file: 'po/en.po'
on 'de'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-11-15 16:43:40 +01:00
Joey Holtzman
d368f99be0
feat(yay): add boolean flags (#2285)
* feat(yay): add boolean flags

This feature now allows users to specify --<option>=<bool value> instead
of using --<option> and --no<option>. Specifying nothing results in the
boolean value being true. The flags prefixed with `no` are deprecated.

* chore(args): Print warning when using deprecated flags

* chore(yay): Update man page for deprecated options and add examples of
boolean flags being used
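A small sketch of the flag semantics described above (hypothetical parser with "cleanafter" as the example option name, not yay's actual argument handling): a bare flag means true, an explicit =value is parsed as a boolean, and the deprecated no-prefixed form maps to false.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func parseBoolFlag(arg, name string) (value, ok bool) {
	arg = strings.TrimPrefix(arg, "--")
	switch {
	case arg == name:
		return true, true // bare flag defaults to true
	case arg == "no"+name:
		return false, true // deprecated negated form
	case strings.HasPrefix(arg, name+"="):
		v, err := strconv.ParseBool(strings.TrimPrefix(arg, name+"="))
		return v, err == nil
	}
	return false, false
}

func main() {
	for _, a := range []string{"--cleanafter", "--cleanafter=false", "--nocleanafter"} {
		v, ok := parseBoolFlag(a, "cleanafter")
		fmt.Println(a, v, ok)
	}
}
```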
2023-09-27 12:36:57 +02:00
Eng Zer Jun
a1121556be
refactor: remove redundant len check (#2291)
`len` returns 0 if the slice or map is nil. From the Go specification
[1]:

  "1. For a nil slice, the number of iterations is 0."
  "3. If the map is nil, the number of iterations is 0."

Therefore, an additional `len(v) != 0` check before the loop is
unnecessary.

[1]: https://go.dev/ref/spec#For_range

Signed-off-by: Eng Zer Jun <engzerjun@gmail.com>
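A standalone sketch of the simplification (illustrative code, not the refactored yay function): the guarded and unguarded loops behave identically for nil or empty input.

```go
package main

import "fmt"

func printAllGuarded(v []string) {
	if len(v) != 0 { // redundant guard
		for _, s := range v {
			fmt.Println(s)
		}
	}
}

func printAll(v []string) {
	for _, s := range v { // iterates zero times for nil or empty v
		fmt.Println(s)
	}
}

func main() {
	var nilSlice []string
	printAllGuarded(nilSlice) // prints nothing
	printAll(nilSlice)        // prints nothing
}
```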
2023-09-27 12:36:08 +02:00
Jo
299aa1e123
fix(download): do not garble download output by default (#2283)
limit default concurrent downloads to 1
2023-09-18 09:51:32 +02:00
Joey Holtzman
04c76a404e
feat(install): add --keepsrc to keep pkg/ and src/ directories (#2272)
* feat(install): add --nocleanbuild to keep pkg/ and src/ directories for
AUR packages

Providing this flag during installation of AUR packages allows for keeping
the src/ and pkg/ directories produced by makepkg. If the user wants to
delete the directories, they can either select to cleanBuild in the
cleanmenu or run the installation without the --nocleanbuild flag (yay
will only remove the directories if the package is rebuilt)

* fix(completion): simplify description for --nocleanbuild in fish

This makes the description consistent with the descriptions in the
man page, --help, and zsh completion.

* refactor(install): Rename --nocleanbuild to --keepsrc

This naming scheme is more familiar to users since it is the name of the
flag in Paru.

---------

Co-authored-by: jguer <me@jguer.space>
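A minimal sketch of the cleanup behaviour described above, assuming a hypothetical keepSrc option and build directory layout (not yay's actual installer code): the directories are only removed when the package was rebuilt and the option is unset.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cleanBuildDirs(buildDir string, rebuilt, keepSrc bool) error {
	if keepSrc || !rebuilt {
		return nil // keep src/ and pkg/ for inspection or reuse
	}
	for _, d := range []string{"src", "pkg"} {
		if err := os.RemoveAll(filepath.Join(buildDir, d)); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	// keepSrc is set, so nothing is removed even though the package was rebuilt.
	fmt.Println(cleanBuildDirs("/tmp/yay-build/example-pkg", true, true))
}
```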
2023-09-18 09:21:42 +02:00
Jo
e60ccdf8b7
Fix image recursive build (#2280)
* fix image recursive build

* add test fixes

* disable buildvcs

* fix integration
2023-09-15 11:17:18 +02:00
transifex-integration[bot]
b6c72ce7a2
Updates for file po/en.po in he on branch next (#2274)
Translate po/en.po in he

100% translated source file: 'po/en.po'
on 'he'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-09-08 18:21:41 +02:00
Esteban Blanc
87d1fd1c06
Add one-liner install for binary source (#2268) 2023-09-07 10:58:30 +02:00
transifex-integration[bot]
92d50910de
Updates for file po/en.po in ko on branch next (#2266)
* Translate po/en.po in ko

100% translated source file: 'po/en.po'
on 'ko'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-09-07 10:44:11 +02:00
Joey Holtzman
a0e6838a5f
fix(yay): fix missing placeholders in translations (#2271) 2023-09-07 10:43:45 +02:00
Jo
8916cd174b
refactor(yay): move cfg inside of runtime (#2259)
* rework relationship between runtime and cfg

* separate runtime from cfg

* simplify instantiation logic

* move installer to appropriate package

* move operator to sync package

* add tests for srcinfo service

* consolidate srcinfo service in sync

* add logger to srcinfo

* add logger to preparer

* remove unused text functions

* remove remaining text.* from srcinfo

* remove global logger parts

* remove global org method exports

* remove global logger

* move text->input

* add rule to prevent fmt.Print

* update golangci go version

* remove outdated FAQs

* remove outdated FAQs
2023-08-06 21:39:41 +02:00
Jo
7483393377
fix(sync): add missing Replaces to -Si (#2257) 2023-08-02 07:45:14 +00:00
jguer
9aefb8440e
add missing login 2023-07-30 23:16:30 +02:00
Jo
6c1998f6eb
chore(ci): update builder image ci (#2254) 2023-07-30 22:07:28 +02:00
smolx
688434b242
chore(yay): remove unnecessary Graph initialization (#2251)
Just an additional correction to one of my commits.
2023-07-26 10:28:23 +02:00
Jo
5995e55ddb
chore(topo): move topo to where it's used (#2250)
move topo into dep
2023-07-23 20:20:05 +00:00
Jo
04c82b8112
chore(yay): replace custom set package with dep (#2249)
* replace string set with dep

* remove unused field

* remove custom string set package
2023-07-23 17:29:01 +00:00
transifex-integration[bot]
abd398a787
Updates for file po/en.po in nl on branch next (#2243)
Translate po/en.po in nl

100% translated source file: 'po/en.po'
on 'nl'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-07-16 09:08:11 +00:00
Jo
23b053bccf
fix(aur): only check gpg signature after gpg import . fixes #2165 (#2239)
only check gpg on install
2023-07-10 09:54:43 +02:00
Joey Holtzman
dadc8c0d98
Re-add functionality for Installed and NotInstalled options in the menus (#2233)
* fix(menus): Handle Installed and NotInstalled options correctly in the
menus

This functionality was temporarily removed. This commit adds that
functionality back.

* fix(tests): Mock InstalledRemotePackageNamesFn when necessary
2023-07-06 06:54:21 +00:00
smolx
6dd7933fbe
Fix handling targets with specified db (#2218)
* Fix handling targets with specified db

Handle it in a similar way to targets with an unspecified db.

Also refactored the GraphSyncPkg method to make the logic more DRY.

* update go-mod
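A minimal sketch of the shared handling, assuming targets may carry an optional "db/" prefix (illustrative helper, not yay's GraphSyncPkg): split off the prefix first, then resolve the package name the same way in both cases.

```go
package main

import (
	"fmt"
	"strings"
)

// splitTarget separates an optional database prefix from the package name,
// e.g. "extra/git" -> ("extra", "git") and "git" -> ("", "git").
func splitTarget(target string) (db, pkg string) {
	if i := strings.Index(target, "/"); i != -1 {
		return target[:i], target[i+1:]
	}
	return "", target
}

func main() {
	for _, t := range []string{"extra/git", "git"} {
		db, pkg := splitTarget(t)
		fmt.Printf("db=%q pkg=%q\n", db, pkg)
	}
}
```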
2023-07-06 06:53:46 +00:00
Joey Holtzman
d9029face3
Don't upgrade repo packages when --aur is specified (#2234)
fix(aur_install): Don't upgrade repo packages when --aur is specified
2023-07-06 06:20:51 +00:00
transifex-integration[bot]
64f5c2b0a9
Updates for po/en.po in zh_CN (#2232)
* Translate po/en.po in zh_CN

100% translated source file: 'po/en.po'
on 'zh_CN'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-07-06 06:19:41 +00:00
transifex-integration[bot]
93afb03738
Updates for po/en.po in pt (#2231)
* Translate po/en.po in pt

100% translated source file: 'po/en.po'
on 'pt'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-26 08:33:32 +02:00
transifex-integration[bot]
0dcf911e99
Updates for po/en.po in it_IT (#2229)
Translate po/en.po in it_IT

100% translated source file: 'po/en.po'
on 'it_IT'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:46:09 +02:00
transifex-integration[bot]
2be57cb312
Updates for po/en.po in ko (#2226)
Translate po/en.po in ko

100% translated source file: 'po/en.po'
on 'ko'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:45:59 +02:00
transifex-integration[bot]
f070cff9f9
Updates for po/en.po in sv (#2227)
Translate po/en.po in sv

100% translated source file: 'po/en.po'
on 'sv'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:45:52 +02:00
transifex-integration[bot]
c46f5d31cc
Updates for po/en.po in ru (#2225)
Translate po/en.po in ru

100% translated source file: 'po/en.po'
on 'ru'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:45:47 +02:00
transifex-integration[bot]
74c1cdb254
Updates for po/en.po in id (#2224)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on 'id'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:45:37 +02:00
transifex-integration[bot]
79b03fdac1
Updates for po/en.po in pt_BR (#2223)
Translate po/en.po in pt_BR

100% translated source file: 'po/en.po'
on 'pt_BR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:45:30 +02:00
transifex-integration[bot]
5a3c3ae4d0
Updates for po/en.po in fr_FR (#2228)
Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:45:04 +02:00
transifex-integration[bot]
710ff0097a
Updates for po/en.po in es (#2222)
Translate po/en.po in es

100% translated source file: 'po/en.po'
on 'es'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:44:34 +02:00
transifex-integration[bot]
ddeaf47a53
Updates for po/en.po in ca (#2221)
Translate po/en.po in ca

100% translated source file: 'po/en.po'
on 'ca'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-23 20:44:23 +02:00
Jo
4f7b3edefe
chore(yay): update deps (#2220)
* update deps

* update po
2023-06-23 20:15:47 +02:00
Jo
5b8cc98afa
fix(dep_graph): do not add a package's provides when it provides itself. fix: #2215 (#2216)
fix #2215: a package provides itself
2023-06-19 08:13:47 +00:00
transifex-integration[bot]
e25d00015a
Updates for po/en.po in ru (#2212)
Translate po/en.po in ru

100% translated source file: 'po/en.po'
on 'ru'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-17 21:02:31 +00:00
smolx
c5a18e5000
Fix install reason preservation when package is reinstalled (#2195) 2023-06-11 16:16:05 +00:00
smolx
adde043514
Add --askyesremovemake option (#2199)
Same as --askremovemake option but with "Y" as a default answer.
2023-06-11 16:13:01 +00:00
smolx
7dc4fae155
Fix excluding of packages with unsatisfied deps (#2203)
* Fix excluding of packages with unsatisfied deps

When a dependency is unsatisfied, add to the graph not only a dependency
node but also its relationship with the parent.

* Remove excess (duplicate) logic

* Add test cases of upgrading with unsatisfied deps
2023-06-11 16:10:48 +00:00
transifex-integration[bot]
599a5a9073
Updates for po/en.po in pt_BR (#2206)
Translate po/en.po in pt_BR

100% translated source file: 'po/en.po'
on 'pt_BR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-11 16:10:06 +00:00
smolx
12282fb28a
Fix sorting by name in package search (#2198)
Slightly reorganize and add more tests for SourceQueryBuilder.
2023-06-03 14:01:17 +00:00
smolx
0607090719
Fix deps resolving for installs from SRCINFO (#2190)
Also made MockBuilder and MockRunner thread-safe.
2023-06-03 13:53:33 +00:00
transifex-integration[bot]
1568e64d55
Updates for po/en.po in ko (#2201)
* Translate po/en.po in ko

100% translated source file: 'po/en.po'
on 'ko'.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-03 13:50:17 +00:00
transifex-integration[bot]
d08f217b3a
Updates for po/en.po in fr_FR (#2202)
Translate po/en.po in fr_FR

100% translated source file: 'po/en.po'
on 'fr_FR'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-06-03 13:50:02 +00:00
transifex-integration[bot]
29f47a4413
Translations for po/en.po in es (#2197)
Translate po/en.po in es

100% translated source file: 'po/en.po'
on the 'es' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-30 08:43:35 +00:00
smolx
35ee42d343
Remove redundant attempts to set install reasons (#2196)
Redundant attempts took place when the pacman installation exited
with code 1, i.e. a real error occurred or the installation was
simply cancelled.
2023-05-30 08:40:57 +00:00
moson-mo
1335e9b4e0
fix(pkgbuild): convert package name for gitlab URLs (#2191)
Convert package names to gitlab repo names

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-25 11:00:33 +02:00
Jo
e28319fece
chore(yay): lift legacy engine (#2189)
* remove legacy engine

* remove legacy dep handlers

* use prepare for gendb

* remove unused method

* remove aur client old interface

* remove unused menu fns

* remove inactive upgrademenu option

* unexport printInfo
2023-05-24 08:22:18 +00:00
smolx
c1aa71bee1
Fix AUR dependency resolving (#2169)
Before this fix, dependencies for AUR targets were added to the graph
after each addition of a target node. Now dependencies are added only
after all target nodes have been added to the graph.

Also added some tests for previously buggy cases.
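A compact sketch of the ordering change (hypothetical graph type, not yay's dep package): register every target node first, then attach dependencies in a second pass once all targets are known.

```go
package main

import "fmt"

type graph struct {
	nodes map[string][]string // node name -> dependency names
}

func newGraph() *graph { return &graph{nodes: map[string][]string{}} }

func (g *graph) addTarget(name string)              { g.nodes[name] = nil }
func (g *graph) addDeps(name string, deps []string) { g.nodes[name] = deps }

func buildGraph(targets map[string][]string) *graph {
	g := newGraph()
	// Phase 1: register all target nodes.
	for name := range targets {
		g.addTarget(name)
	}
	// Phase 2: only now attach dependencies, so every target is already present.
	for name, deps := range targets {
		g.addDeps(name, deps)
	}
	return g
}

func main() {
	g := buildGraph(map[string][]string{
		"aur-pkg-a": {"aur-pkg-b"},
		"aur-pkg-b": nil,
	})
	fmt.Println(len(g.nodes)) // 2
}
```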
2023-05-23 21:18:41 +00:00
smolx
56d1b7ed1c
Fix --rebuild option (#2163)
* Reimplement --rebuild option in the new engine (#2153)

* Refactor --rebuild option

* Fix comment formatting
2023-05-23 21:16:27 +00:00
transifex-integration[bot]
036a53882d
Translations for po/en.po in sv (#2188)
Translate po/en.po in sv

100% translated source file: 'po/en.po'
on the 'sv' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-23 21:08:20 +00:00
transifex-integration[bot]
330b9ab920
Translations for po/en.po in id (#2187)
Translate po/en.po in id

100% translated source file: 'po/en.po'
on the 'id' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-23 15:43:27 +02:00
transifex-integration[bot]
98d5352b78
Translations for po/en.po in ca (#2186)
Translate po/en.po in ca

100% translated source file: 'po/en.po'
on the 'ca' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-23 11:58:15 +02:00
Jo
27f336e68c
chore(deps): update deps (#2184)
update deps
2023-05-23 07:41:58 +00:00
transifex-integration[bot]
23937356eb
Translations for po/en.po in it_IT (#2183)
Translate po/en.po in it_IT

100% translated source file: 'po/en.po'
on the 'it_IT' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-23 09:22:10 +02:00
Jo
9641d2a608
chore(yay): add warning for yay -c (#2181)
add warning for yay -c
2023-05-22 22:02:14 +00:00
Jo
df3dbfa125
chore(yay): update po (#2180)
* update po

* avoid extra translate work
2023-05-22 21:52:06 +00:00
Jo
fdcf6ef664
ci(yay): add integration test framework (#2178)
* add integration test framework

* fix integration tests

* fix integration tests
2023-05-22 21:38:02 +00:00
Jo
39f0d4e9a1
chore(yay): remove unused constant (#2179)
remove unused constant

Co-authored-by: christian-heusel <christian@heusel.eu>
2023-05-22 20:59:30 +00:00
Andrew Geng
d33bf8841d
Fix yay -Sc wiping ~/.cache/yay on 3rd question. (#2175)
If you answer yes to
    :: Do you want to remove all other AUR packages from cache? [Y/n]
then we run cleanAUR(), intending to remove subdirectories of
~/.cache/yay that do not share a name with installed packages not
found in the sync repositories.

Where this went wrong: cleanAUR() was getting an empty map from
dbExecutor.InstalledRemotePackages(), because InstalledRemotePackages
only recomputes its result if installedRemotePkgMap is nil, whereas
NewExecutor initialized it to an empty map. The symptom was that it
emptied my ~/.cache/yay.

We do want a non-nil, empty installedRemotePkgMap to block recomputing
(that is, to indicate the user really has no remote packages), so now
NewExecutor initializes it to nil, and getPackageNamesBySource is
responsible for making sure it's non-nil before writing to it.

Fixes #2152, which seems to have been introduced in
4626a0409c4d34bcffe0d5ed499ebff893115c69.
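A minimal sketch of the caching pattern involved (illustrative names, not yay's db executor): the result is only recomputed while the cache map is nil, so the cache must start as nil and only become non-nil once it has actually been populated.

```go
package main

import "fmt"

type executor struct {
	installedRemotePkgMap map[string]bool // nil means "not computed yet"
}

func newExecutor() *executor {
	// Leave the cache nil; initializing it to map[string]bool{} would make
	// installedRemotePackages believe an (empty) result was already computed.
	return &executor{}
}

func (e *executor) installedRemotePackages() map[string]bool {
	if e.installedRemotePkgMap == nil {
		e.refresh()
	}
	return e.installedRemotePkgMap
}

func (e *executor) refresh() {
	// Ensure the map is non-nil before writing, so even an empty result
	// counts as "computed" and legitimately blocks recomputation.
	if e.installedRemotePkgMap == nil {
		e.installedRemotePkgMap = make(map[string]bool)
	}
	e.installedRemotePkgMap["some-aur-package"] = true
}

func main() {
	e := newExecutor()
	fmt.Println(len(e.installedRemotePackages())) // 1
}
```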
2023-05-22 20:34:51 +00:00
Jo
bd79057fd9
feat(pkgbuild): use gitlab git repo for retrieving PKGBUILDs (#2177)
use gitlab git repo for retrieving PKGBUILDs
2023-05-22 20:22:49 +00:00
smolx
a0a5e45fe7
Fix -Qu exit code for empty update lists (#2162)
Fix -Qu exit code for empty update lists (#2061)

Previously, -Qun and -Qum without available updates could exit
with code 0 in some cases.
Also fix existing tests and add more tests for such cases.
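A tiny sketch of the intended exit-code behaviour (hedged illustration, not yay's query code): like pacman's -Qu, report success only when there is at least one update to print.

```go
package main

import (
	"fmt"
	"os"
)

func printUpdates(updates []string) int {
	for _, u := range updates {
		fmt.Println(u)
	}
	if len(updates) == 0 {
		return 1 // no updates: signal it through the exit code
	}
	return 0
}

func main() {
	os.Exit(printUpdates(nil)) // exits 1, as -Qun/-Qum should with an empty list
}
```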
2023-05-22 18:35:27 +00:00
smolx
ec15a5b363
Implement tests for -Qu/q (#2156)
Implement tests for -Qu/q (#2061)
2023-05-15 10:33:12 +02:00
Jo
d568a73ab8
fix(metadata): reduce cache validity (#2161)
fix metadata cache validity too long
2023-05-15 08:30:41 +00:00
Joaquim Monteiro
490ebe4f7f
Fix formatting of error message that occurs on AUR errors (#2154)
fix: fix formatting of error message that occurs on AUR errors
2023-05-08 21:43:50 +00:00
transifex-integration[bot]
4dfee1f82f
Translate 'po/en.po' in 'fr_FR' (#2150)
Translate po/en.po in fr_FR

100% translated for the source file 'po/en.po'
on the 'fr_FR' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-04 16:14:26 +00:00
transifex-integration[bot]
5cf215a00a
Translate 'po/en.po' in 'es' (#2151)
Translate po/en.po in es

100% translated for the source file 'po/en.po'
on the 'es' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-05-04 16:14:15 +00:00
Jo
f7731d7cf9
fix(groups): fix sync group getting passed to reason setter (#2148)
fix sync group install #2137
2023-05-01 16:21:40 +00:00
Jo
966bfb74ee
feat(dep): add provider resolution on first layer (#2147)
add provider resolution on first layer
2023-05-01 15:56:42 +00:00
Jo
c721fe7f3b
fix(search): fix -Si crash when using metadata engine (#2146)
fix missing By query in metadata search
2023-05-01 12:52:43 +00:00
Jo
b51c10ca3e
doc(yay): specify search narrowing in man (#2145)
specify search narrowing in man
2023-05-01 12:33:19 +00:00
Tanmay Chaudhry
4832ec59db
Fix version diff word detection (#2124)
* Fix version diff word detection

* remove double negation

* preserve text color after tests
2023-04-27 07:23:25 +00:00
Jo
49267b9cd9
feat(upgrade): separate menu for pulled along dependencies (#2141)
try separate menu for pulled along

use installed as term

fix order gap

fix tests

add aur db + aur scenario
2023-04-27 07:20:21 +00:00
Jo
e6344100e6
fix(upgrade): fix local pulled dependencies replacing upgrade information (#2140)
avoid upgrade info loss

fix colormap
2023-04-27 07:04:09 +00:00
Tanmay Chaudhry
15400c5fc5
Switch the TZ to UTC temporarily for running the News tests (#2121)
* switch the TZ to UTC temporarily for running the tests

* Remove unnecessary environment reset

* Set test specific environment.

Co-authored-by: Jo <me@jguer.space>

---------

Co-authored-by: Jo <me@jguer.space>
2023-04-26 13:54:37 +02:00
transifex-integration[bot]
822b11b4d6
Translate 'po/en.po' in 'pt' (#2133)
Translate po/en.po in pt

100% translated for the source file 'po/en.po'
on the 'pt' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-26 13:53:54 +02:00
transifex-integration[bot]
71432a447e
Translate 'po/en.po' in 'cs' (#2130)
Translate po/en.po in cs

100% translated for the source file 'po/en.po'
on the 'cs' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-26 13:53:29 +02:00
transifex-integration[bot]
957292a911
Translate 'po/en.po' in 'zh_TW' (#2138)
Translate po/en.po in zh_TW

100% translated for the source file 'po/en.po'
on the 'zh_TW' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-26 13:46:06 +02:00
transifex-integration[bot]
8c69356bd4
Translate 'po/en.po' in 'it_IT' (#2126)
Translate po/en.po in it_IT

100% translated for the source file 'po/en.po'
on the 'it_IT' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-20 08:58:19 +02:00
transifex-integration[bot]
64bb346fbe
Translate 'po/en.po' in 'de' (#2119)
* Translate po/en.po in de

100% translated for the source file 'po/en.po'
on the 'de' language.

* Translate po/en.po in de

100% translated for the source file 'po/en.po'
on the 'de' language.

* Translate po/en.po in de

100% translated for the source file 'po/en.po'
on the 'de' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-16 16:23:42 +00:00
transifex-integration[bot]
797b7485ab
Translate 'po/en.po' in 'sv' (#2118)
* Translate po/en.po in sv

100% translated for the source file 'po/en.po'
on the 'sv' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-16 16:23:27 +00:00
Jo
57a3a090f1
fix(vcs): improve timeout handling for vcs upgrade check (#2120)
remove the timeout from the full operation and increase the timeout for individual checks
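A rough sketch of the timeout split described above (hypothetical durations and check function, not yay's vcs package): no deadline on the overall loop, but each individual VCS lookup gets its own context timeout.

```go
package main

import (
	"context"
	"fmt"
	"time"
)

func checkOne(ctx context.Context, repo string) error {
	ctx, cancel := context.WithTimeout(ctx, 45*time.Second) // per-check timeout
	defer cancel()
	select {
	case <-time.After(10 * time.Millisecond): // placeholder for a `git ls-remote`-style lookup
		return nil
	case <-ctx.Done():
		return ctx.Err()
	}
}

func main() {
	ctx := context.Background() // no timeout for the operation as a whole
	for _, repo := range []string{"yay-git", "paru-git"} {
		fmt.Println(repo, checkOne(ctx, repo))
	}
}
```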
2023-04-16 16:23:10 +00:00
transifex-integration[bot]
f0bfe63ced
Translate 'po/en.po' in 'ko' (#2116)
* Translate po/en.po in ko

100% translated for the source file 'po/en.po'
on the 'ko' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-13 06:50:20 +00:00
Jo
83214fbc1c
chore(yay): add local newer to AUR warnings (#2113) 2023-04-11 21:15:47 +00:00
Jo
161fede450
chore(yay): use new aur client for -Si (#2112)
* chore(yay): use new aur client for -Si

* chore(yay): use new client for clean
2023-04-11 21:15:21 +00:00
Jo
26c9ab5a87
fix(yay): fix panic -Si when package is missing (#2111)
fix panic -Si
2023-04-11 16:45:20 +00:00
Jo
4a9c736e2a
chore(upgrade): add makedep explain to the upgrade menu (#2110)
* display required by

* cutoff at 2
2023-04-11 16:41:34 +00:00
adasauce
76e5ee1fa6
fix: querybuilder GetTargets start iteration at 1, when include is 0 causes a crash (#2109) 2023-04-11 16:23:19 +00:00
Jo
88008e4eb3
feat(yay): skip confirmed confirms (#2107)
* skip pacman confirmations when yay confirmations are done

* default to double confirm

* fix tests
2023-04-11 11:51:39 +00:00
Jo
c7a51a1614
fix(text): ensure error logs go to stderr (#2105)
ensure Error logs go to stderr
2023-04-10 16:26:09 +00:00
transifex-integration[bot]
6a971df635
Translate 'po/en.po' in 'ca' (#2100)
* Translate po/en.po in ca

100% translated for the source file 'po/en.po'
on the 'ca' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-10 15:47:31 +00:00
Jo
511b95769e
docs(yay): correct order of editor var evaluation (#2104)
correct documentation on order of editor preference. Use logger for editfn
2023-04-10 15:46:46 +00:00
transifex-integration[bot]
4e9a865388
Translate 'po/en.po' in 'id' (#2092)
Translate po/en.po in id

100% translated for the source file 'po/en.po'
on the 'id' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-08 07:58:54 +00:00
transifex-integration[bot]
9270f00c7e
Translate 'po/en.po' in 'pt' (#2093)
Translate po/en.po in pt

100% translated for the source file 'po/en.po'
on the 'pt' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-08 07:58:37 +00:00
transifex-integration[bot]
527c3a5058
Translate 'po/en.po' in 'tr' (#2089)
* Translate po/en.po in tr

100% translated for the source file 'po/en.po'
on the 'tr' language.

* Translate po/en.po in tr

100% translated for the source file 'po/en.po'
on the 'tr' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-06 20:57:20 +00:00
Jo
1ee94f28d3
fix(new_engine): add missing warnings to AUR updates (#2087)
* add missing warnings to AUR updates

* fix tests
2023-04-06 17:06:46 +00:00
Jo
a64180464b
chore(man): fix wording on operation order to make it more clear (#2088)
fix wording on operation order to make it more clear
2023-04-06 16:49:18 +00:00
transifex-integration[bot]
a31ca0d7dc
Translate 'po/en.po' in 'it_IT' (#2084)
* Translate po/en.po in it_IT

100% translated for the source file 'po/en.po'
on the 'it_IT' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-06 16:32:53 +00:00
transifex-integration[bot]
1bc3171abd
Translate 'po/en.po' in 'sv' (#2075)
* Translate po/en.po in sv

100% translated for the source file 'po/en.po'
on the 'sv' language.

* Translate po/en.po in sv

100% translated for the source file 'po/en.po'
on the 'sv' language.

---------

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-06 16:32:45 +00:00
Jo
2d7297ae6d
fix(upgrades): don't include makedeps into -Qu/a (#2085)
don't include makedeps into -Qu/a
2023-04-06 16:32:22 +00:00
Jo
ce0cb35510
feat(new_engine): add support for --nocombinedupgrade (#2083)
* feat(new_engine): add support for --nocombinedupgrade

* update en.po
2023-04-06 11:54:03 +00:00
transifex-integration[bot]
d9b57790fa
Translate 'po/en.po' in 'pt' (#2081)
Translate po/en.po in pt

100% translated for the source file 'po/en.po'
on the 'pt' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-06 11:45:53 +00:00
Jo
5a3f277574
chore(yay): make warning less scary (#2074) 2023-04-05 11:53:11 +00:00
Jo
d1c012085c
fix(yay): ensure yay -Qu & -Quq don't get stuck looking for deps (#2072) 2023-04-05 11:43:30 +00:00
Jo
89f47f8ebe
fix(yay): correct operation order for preparer (#2071)
* fix(yay): reset to origin/HEAD for clean_menu

* fix(yay): correct operation order for preparer

* test(yay): fix flaky test
2023-04-05 11:27:42 +00:00
transifex-integration[bot]
5b5617c7e7
Translate 'po/en.po' in 'it_IT' (#2065)
Translate po/en.po in it_IT

100% translated for the source file 'po/en.po'
on the 'it_IT' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-05 10:36:52 +02:00
transifex-integration[bot]
b80ef15add
Translate 'po/en.po' in 'sv' (#2063)
Translate po/en.po in sv

100% translated for the source file 'po/en.po'
on the 'sv' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-05 10:36:42 +02:00
transifex-integration[bot]
e0fbb4495a
Translate 'po/en.po' in 'id' (#2068)
Translate po/en.po in id

100% translated for the source file 'po/en.po'
on the 'id' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-04-05 10:36:35 +02:00
jguer
5c7f9ba159
update en.po 2023-04-04 00:14:39 +02:00
Jo
0892fc7cdd
fix(query): reduce verbosity of -Qu (#2060) 2023-04-03 22:05:45 +00:00
jguer
c78d031b32
Revert "ci(yay): execute release-notary publish on ci (#2056)"
This reverts commit 6983a9ee7e760764df1fe7a25b3f5aa9ca3341dc.
2023-04-03 18:54:19 +02:00
Jo
6983a9ee7e
ci(yay): execute release-notary publish on ci (#2056) 2023-04-03 16:49:00 +00:00
Jo
ada8261bca
fix(yay): check if downgrade has smaller version (#2054) 2023-04-03 16:37:41 +00:00
Jo
3f09397816
Update bug_report.md (#2053) 2023-04-03 16:34:39 +00:00
Jo
9532e7b7da
fix(yay): show yay usage always on --help -h (#2051)
fix yay usage not showing
2023-04-03 13:50:49 +02:00
bittin
956c4cb100
Add bittin as translator (#2050) 2023-04-03 11:10:10 +00:00
Jo
16cce4384b
ci(lint): update builder image (#2048)
* last min ci changes

* readme update
2023-04-03 07:32:25 +00:00
Jo
c3888d9881
chore(ci): add missing translation files to release package (#2047)
add missing pos
2023-04-03 07:25:59 +00:00
Jo
c63576c36d
fix(search): always trust search if there's only one term (#2046)
always trust search if there's only one term
2023-04-03 07:16:03 +00:00
Jo
dd42593ba1
fix(search): only do exact trim in certain modes (#2045)
only do exact trim in certain modes
2023-04-02 17:50:35 +00:00
Jo
d13bdb0ce1
feat(search): improve exact match for separate source (#2044)
* unify query builder

* remove unneeded code

* reorganize code
2023-04-02 00:23:02 +00:00
Jo
6390d1c2b0
chore(query_builder): tweak mixed sources similarity matcher. related to #1719 (#2043)
tweak mixed sources similarity matcher. related to #1719
2023-04-01 13:30:50 +00:00
Jo
9028f5d8be
ci(release): update release notary (#2042)
update release notary
2023-04-01 10:48:46 +00:00
Jo
5d1c54413c
fix(query_builder): use correct aur client for mixed query builder (#2041)
* use same repo search as pacman

use logger child from runtime

* use common interface for aur clients
2023-04-01 10:33:05 +00:00
Jo
e615f8e07e
fix(new_engine): exclude menu removing new deps (#2040)
fix exclude menu removing new deps
2023-03-31 22:15:57 +00:00
Jo
d75e0a001d
fix(clean): modify clean args (#2039) 2023-03-31 21:22:57 +00:00
Ferdinand Bachmann
2bdbc3e06b
fix(aur_install): fix debug packages being added to deps even if not found (#2038) 2023-03-30 21:48:33 +00:00
Jo
01666aef37
fix(vcs): add extra context to errors and increase timeouts (#2037)
* give a more complete message on vcs error

* bump timeouts for vcs checking
2023-03-30 18:57:56 +00:00
transifex-integration[bot]
68337a58c1
Translate 'po/en.po' in 'it_IT' (#2035)
Translate po/en.po in it_IT

100% translated for the source file 'po/en.po'
on the 'it_IT' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-30 17:08:28 +00:00
transifex-integration[bot]
ca12cd7156
Translate 'po/en.po' in 'tr' (#1993)
Translate po/en.po in tr

100% translated for the source file 'po/en.po'
on the 'tr' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-29 11:47:47 +02:00
Ferdinand Bachmann
bebe80bb84
fix(aur_install): fix debug packages not being found (#1977) 2023-03-29 09:08:25 +00:00
transifex-integration[bot]
9137c1e95f
Translate 'po/en.po' in 'zh_CN' (#1978)
Translate po/en.po in zh_CN

100% translated for the source file 'po/en.po'
on the 'zh_CN' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-29 09:06:59 +00:00
transifex-integration[bot]
bdd888c59d
Translate 'po/en.po' in 'ko' (#1992)
Translate po/en.po in ko

100% translated for the source file 'po/en.po'
on the 'ko' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-29 09:06:38 +00:00
transifex-integration[bot]
16b9516f96
Translate 'po/en.po' in 'fr_FR' (#1974)
Translate po/en.po in fr_FR

100% translated for the source file 'po/en.po'
on the 'fr_FR' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-15 09:34:10 +01:00
Jo
dfa7ed51c1
chore(yay): change found repo to debug (#1966)
change found to debug
2023-03-13 08:55:08 +00:00
Jo
7bc4a666e6
refactor(runtime): Build runtime after cmdargs parsing (#1965)
* extract runtime building from cfg

* respect AURRPCURL

* use -Syu if there are no targets, allowing extra options to be passed

* one more step towards removing runtime from cfg
2023-03-13 08:48:39 +00:00
transifex-integration[bot]
210512a5d6
Translate 'po/en.po' in 'ko' (#1963)
Translate po/en.po in ko

100% translated for the source file 'po/en.po'
on the 'ko' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-12 15:49:31 +00:00
transifex-integration[bot]
57250fec4b
Translate 'po/en.po' in 'sv' (#1962)
Translate po/en.po in sv

100% translated for the source file 'po/en.po'
on the 'sv' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-12 15:49:23 +00:00
transifex-integration[bot]
e56b9cd72b
Translate 'po/en.po' in 'id' (#1960)
Translate po/en.po in id

100% translated for the source file 'po/en.po'
on the 'id' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-11 13:16:10 +00:00
transifex-integration[bot]
794a38fa28
Translate 'po/en.po' in 'pt' (#1959)
Translate po/en.po in pt

100% translated for the source file 'po/en.po'
on the 'pt' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-10 12:10:57 +01:00
Jo
f11f9058c2
fix(logger): respect --debug for setting debug logger (#1955)
respect --debug parse
2023-03-08 21:33:43 +00:00
Jo
7e7764a797
chore(new_engine): use new engine for -Qu (#1954)
* use new engine for -Qu

* fix devel search showing up in -Quq

* test empty upgrade menu
2023-03-08 21:30:54 +00:00
Jo
c744058b20
Update manual page and remove deprecated/removed options (#1951)
* add new options and remove deprecated

* add new -Bi
2023-03-08 13:07:15 +00:00
transifex-integration[bot]
7073939cdc
Translate 'po/en.po' in 'ca' (#1950)
Translate po/en.po in ca

100% translated for the source file 'po/en.po'
on the 'ca' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-08 10:47:21 +01:00
Jo
3d5a43c294
chore(yay): bump version (#1949)
bump major
2023-03-07 21:04:06 +00:00
transifex-integration[bot]
46bf36a160
Translate 'po/en.po' in 'sv' (#1948)
Translate po/en.po in sv

100% translated for the source file 'po/en.po'
on the 'sv' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-03-07 20:48:08 +00:00
Jo
463e60e045
chore(deps): update goalpm (#1947)
* update dependencies

* use logger in cmd runner
2023-03-06 09:19:21 +00:00
Jo
e6ed869df1
chore(yay): remove global cfg (#1946)
remove global cfg
2023-03-05 21:58:18 +00:00
Jo
8b8d6001a4
fix(new_engine): Improve partial upgrade protection and pinned deps (#1945)
* fix dep graph, existing in graph

* do not change from same dep reason

* roll up layer installs in case of fail

* re-use pacman exclude mechanism

should finish the reimplementation of the missing guards from the legacy
engine.

* include update in debug log

* test rollups
2023-03-05 17:31:11 +00:00
Joey H
0387dfdb59
fix(install): use global arguments when removing make dependencies (#1940)
fix(install): use global arguments when removing make dependencies

One example where this would previously fail is when the user passed the
`--root` argument. Yay would install the make dependencies into the new
root directory but try to remove them from /, causing the installation to
fail if the make dependencies were not found in the default installation
path (/).
2023-02-26 11:25:02 +00:00
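A minimal sketch of the scenario described in that commit; the root path and package name are placeholders, not taken from the original report:

```sh
# Hypothetical reproduction: make dependencies are installed under the
# alternate root, so removing them must reuse the same global arguments
# instead of defaulting to /.
yay -S --root /mnt/newroot some-aur-package
```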
Jo
fa2e726ca6
chore(text): use logger in new engine services (#1939)
* use logger in vcs

* use logger in query builder

* use logger in migrations
2023-02-25 19:03:27 +00:00
Jo
841395c318
feat(local_install): check PKGBUILD and .SRCINFO presence and generate .SRCINFO if necessary (#1938)
check build file presence and generate if needed
2023-02-25 17:44:24 +00:00
Jo
4e0a5c8520
feat(new_engine): respect -w in AUR package building (#1923)
respect -w in AUR package building
2023-02-21 02:49:48 +00:00
Jo
61f1bdf291
fix(cmd): propagate sigterm to spawned processes (#1927)
propagate sigterm to spawned processes
2023-02-21 02:49:33 +00:00
Jo
3ef4664d99
fix(devel): timeout devel check after 5 secs (#1929)
timeout devel check after 5 secs
2023-02-21 02:48:56 +00:00
Jo
4780a974d9
fix flaky tests (#1928) 2023-02-21 02:42:56 +00:00
Jo
7c8f273cdf
fix(git): add git denylist for workspace variables (#1926)
add git denylist for workspace variables
2023-02-21 02:23:48 +00:00
Jo
a3d51a42da
fix(yay): match -Qu return (#1924)
match -Qu return
2023-02-21 02:00:39 +00:00
jguer
096ff7a544
fix -Si aur & repo mix 2023-02-21 02:55:49 +01:00
Jo
6c870db1f1
fix(aur): respect -uu for AUR downgrade (#1925)
respect -uu for AUR downgrade
2023-02-21 01:51:52 +00:00
Jo
f0433cc339
feat(metadata): respect regex in search (#1922)
respect regex in search
2023-02-21 01:24:48 +00:00
Jo
fad26c078d
fix(getpkgbuild): check AUR pkgs exist before GetPKGBUILD (#1921)
check AUR pkgs exist before GetPKGBUILD
2023-02-20 23:58:13 +00:00
Jo
7490836991
fix(new_engine): respect --needed on target gathering (fixes #1552) (#1920)
* use logger in dep graph

* use logger in dep graph

* use logger in dep graph

* only query for AUR packages once per tier. useful for rpc

* fix performance regression for ros-melodic

* prefer name search first

* implement needed at target gathering

* set default config

* fixup tests for needed
2023-02-20 23:14:59 +00:00
Jo
8d18f1be18
fix(new_engine): add aur client support to mixed source display (#1919)
* update aur client and use correct aur client

* add aur query client support for mixed source engine
2023-02-20 12:14:16 +00:00
Jo
f1d086df1d
Allow use of rpc client as an alternative to the metadata client (#1918)
* use updated aur client

* add logger to rpc client

* update go.mod
2023-02-20 11:20:48 +00:00
Jo
2f5fd5cb1c
fix(new_engine): add missing version for devel packages in combi (#1917)
* fix missing latest-commit in devel upgrade menu

* move test grapher to inner test
2023-02-20 09:51:39 +00:00
Jo
0a8bc1fe2e
fix(new_engine): add missing latest-commit in devel upgrade menu (#1916)
* fix missing latest-commit in devel upgrade menu

* move test grapher to inner test
2023-02-17 19:29:46 +00:00
Jo
0bf4c2e502
feat(new_install): show (#1915)
* show new packages in upgrade form if they exist

* refactor up select

* remove unused graph parts

* readd len

* Complete upgrade graphing

* Extract to upgrade pkg

* remove unused dep method

* remove unneeded dep

* cleanup method

* specify io Reader for testing

* use specified input vector

* fix non-active devel

* test base cases

* add devel test cases

* add range tests

* add logger struct

* use logger struct in upgrade

* follow golangci recommendations

* update deps

* update golangci
2023-02-17 19:01:26 +00:00
Jo
4f50b799ef
feat(local_install): add choice menu for yay -Bi (#1903)
add choice menu for yay -Bi
2023-01-23 23:54:15 +00:00
Jo
4626a0409c
fix(vcs): do not vcs update gather orphan info (#1902)
* reduce complexity of devel upgrade gathering

* clean orphans devel
2023-01-23 23:03:32 +00:00
Jo
1bfbd01f94
fix(sync): do not update vcs info of failed packages (#1901)
* extract srcinfo service to pkg

* take into account failed installs for vcs update. Fixes #1892

* fix tests
2023-01-23 21:43:58 +00:00
transifex-integration[bot]
04c4b0aa59
Translate '/po/en.po' in 'ru' (#1898)
Translate /po/en.po in ru

translation completed for the source file '/po/en.po'
on the 'ru' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-23 17:03:14 +01:00
Jo
9356481d1c
fix(config): expand tilde for some config fields. Fixes #1774 (#1897)
expand tilde for some fields. Fixes #1774
2023-01-22 23:21:05 +00:00
transifex-integration[bot]
2f1ebb9fde
Translate '/po/en.po' in 'tr' (#1888)
Translate /po/en.po in tr

translation completed for the source file '/po/en.po'
on the 'tr' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
Co-authored-by: jguer <me@jguer.space>
2023-01-19 10:51:43 +01:00
transifex-integration[bot]
5626ed3ff4
Translate '/po/en.po' in 'he' (#1895)
Translate /po/en.po in he

translation completed for the source file '/po/en.po'
on the 'he' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-18 11:46:45 +01:00
Balki
e09209bb19
fix(zsh): fix zsh completion for yay specific flags (#1893) 2023-01-13 13:50:09 +00:00
Jo
c40e949752
fix(build): fix issue where shell conditional output is passed to go build (#1887)
fix issue where shell conditional output is passed to go build

(cherry picked from commit 6744fa721f4737f695b40eac0571e3b4b1504b65)
2023-01-05 15:50:53 +00:00
Jo
b8debd1ae7
chore(yay): fix small linting issues (#1885)
* replace context.TODO() in tests context.Background()

* remove mock TODOs

* prettier

* apply missing linting
2023-01-03 21:43:56 +00:00
Joey H
8948278568
fix(parser): ensure data is piped when using '-' argument (#1881)
fix(parser): ensure data is piped when using '-' argument

Fixes #1626.

Check whether data was actually piped to yay; otherwise it will hang
forever. This error gives users better feedback on how to use the
'-' argument (it is also the exact error pacman throws for invalid
piped data).

Two tests were added. However, I didn't know how to add a test for the
actual part that detects whether data was piped or not (which is
essentially what this commit adds).
2023-01-03 19:43:15 +00:00
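A brief illustration of the '-' behavior fixed above; the package name is only an example:

```sh
# Data is piped, so yay reads the target list from stdin:
echo "linux-headers" | yay -S --needed -

# Nothing is piped: instead of hanging forever, yay now fails with the same
# error pacman gives for invalid piped data.
yay -S -
```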
Jo
86bba8a289
fix(ci): run ci on all PRs (#1884)
* fix devel sysupgrade

* fix ci stuck on .po
2023-01-03 19:35:44 +00:00
transifex-integration[bot]
47b1428a25
Translate '/po/en.po' in 'fr_FR' (#1882)
Translate /po/en.po in fr_FR

translation completed for the source file '/po/en.po'
on the 'fr_FR' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-03 19:30:47 +00:00
Jo
b3c334e014
fix(new_engine): fix devel sysupgrade (#1883)
fix devel sysupgrade
2023-01-03 19:24:28 +00:00
transifex-integration[bot]
b41e67f31e
Translate '/po/en.po' in 'sv' [manual sync] (#1876)
Translate /po/en.po in sv

at least 90% translated for the source file '/po/en.po'
on the 'sv' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 19:06:30 +01:00
transifex-integration[bot]
13109992c5
Translate '/po/en.po' in 'zh_CN' [manual sync] (#1877)
Translate /po/en.po in zh_CN

at least 90% translated for the source file '/po/en.po'
on the 'zh_CN' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 19:04:56 +01:00
transifex-integration[bot]
ed95688c1b
Translate '/po/en.po' in 'zh_TW' [manual sync] (#1878)
Translate /po/en.po in zh_TW

at least 90% translated for the source file '/po/en.po'
on the 'zh_TW' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 19:04:32 +01:00
transifex-integration[bot]
b8b085599a
Translate '/po/en.po' in 'fr_FR' [manual sync] (#1879)
Translate /po/en.po in fr_FR

at least 90% translated for the source file '/po/en.po'
on the 'fr_FR' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 19:03:50 +01:00
transifex-integration[bot]
7a2db4f448
Translate '/po/en.po' in 'it_IT' [manual sync] (#1880)
Translate /po/en.po in it_IT

at least 90% translated for the source file '/po/en.po'
on the 'it_IT' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 19:02:54 +01:00
transifex-integration[bot]
40f058fd19
Translate '/po/en.po' in 'cs' [manual sync] (#1873)
Translate /po/en.po in cs

at least 90% translated for the source file '/po/en.po'
on the 'cs' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 19:00:45 +01:00
transifex-integration[bot]
f771424336
Translate '/po/en.po' in 'de' [manual sync] (#1874)
Translate /po/en.po in de

at least 90% translated for the source file '/po/en.po'
on the 'de' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 18:59:38 +01:00
transifex-integration[bot]
36282a8192
Translate '/po/en.po' in 'id' [manual sync] (#1875)
Translate /po/en.po in id

at least 90% translated for the source file '/po/en.po'
on the 'id' language.

 Manual sync of partially translated files: untranslated content is included with an empty translation or source language content depending on file format

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2023-01-02 18:58:51 +01:00
Jo
f372494d74
feat(search): add new bys and misc fixes (#1870)
* use default bin entry of gpg

* fix(dep): fix displayed information in chosen provide

* add new rpc bys to searchby

* man document
2022-12-30 19:02:32 +00:00
Jo
9be51052f7
update deps (#1868)
* update deps

* align struct
2022-12-29 18:42:43 +00:00
Jo
28d90c981e
feat(new engine): local install feature testing (#1867)
* make config into parameter

* test(new engine): local install test

* test(keys): fix test keys

* complete integration test for local install

* add simple mising mechanism
2022-12-29 12:34:53 +00:00
Joey H
d3fbfa26ca
fix(pre-commit): update URL and versions in the pre-commit configuration (#1865)
There was a dead link in the pre-commit-golang URL. Also, Woile/commitizen redirects to commitizen-tools/commitizen, so change that URL as well. Lastly, update all the versions using pre-commit autoupdate.

This will make it easier for new contributors to use the pre-commit
hooks when contributing.
2022-12-27 18:56:28 +00:00
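For reference, a typical local workflow with the updated hooks might look like this (a sketch assuming pre-commit itself is already installed):

```sh
pre-commit install           # register the hooks from .pre-commit-config.yaml
pre-commit run --all-files   # run every hook once against the whole tree
pre-commit autoupdate        # refresh hook revisions, as this commit did
```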
Joey H
8c61bc9b45
fix(cmd): pass install flags into pacman in yogurt mode (#1864)
Fixes #1560
2022-12-27 18:55:42 +00:00
Joan Bruguera
2bda76e431
fix(clean): notify RemoveAll error when cleaning AUR (#1863)
Otherwise, if any hard-to-delete file lands in the AUR cache folder, running
`yay -Scc` appears to the user to succeed but actually aborts midway.
2022-12-21 10:09:59 +01:00
Jo
9a23b792c4
Improve warning messages (#1857)
improve warning messages
2022-12-19 18:17:47 +00:00
Jo
4e3c664ab3
ci(yay): attempt to use caches more effectively in CI (#1862)
* attempt to use caches more effectively

* fix cache key

* specify cache file
2022-12-18 20:42:22 +00:00
Jo
3eb9eb0d3d
feat(new_engine): add support for noDeps and noCheckDeps (#1861)
* feat(new engine): add support for noDeps and noCheckDeps

* test(new engine): Add tests for -dd and normal split package resolution
2022-12-18 20:14:41 +00:00
Jo
0b1ae938a3
fix(ci): do not install packages at pacman-git CI level (#1860)
* fix(ci): do not attempt to install extra packages for building pacman-git

* install needed in ci image

* fix missing pacman key

* wip

* add missing deps
2022-12-18 18:24:56 +00:00
Jo
f8e7891b0b
refactor(vcs): remove mux and use interface for other packages (#1859)
* refactor(vcs): remove context passing mutex from VCS interface

* simplify devel upgrade gather

* update vcs upgrade tests

* remove unused mock
2022-12-18 16:37:15 +00:00
Jo
4a3e365fc5
AUR provides search rpc (#1856)
* add support for provides search in AUR RPC client

* return provides search to default true
2022-12-18 02:36:49 +01:00
Jo
27cbbf4cb9
tests(new engine): add tests for layering and source handling (#1854)
* add tests for layering and source handling

* add tests for jellyfin split package
2022-12-18 02:00:39 +01:00
Jo
7da9f4869d
feat(new engine): skip built and respect --needed (#1852)
* add built package check

* respect --needed for new engine

* add needed check and test

* add test for not built
2022-12-16 17:23:44 +00:00
transifex-integration[bot]
c826456d4d
Translate '/po/en.po' in 'ko' (#1851)
Translate /po/en.po in ko

translation completed for the source file '/po/en.po'
on the 'ko' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2022-12-12 09:40:33 +00:00
transifex-integration[bot]
9df087bd3d
Translate '/po/en.po' in 'pt' (#1845)
Translate /po/en.po in pt

translation completed for the source file '/po/en.po'
on the 'pt' language.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2022-12-06 18:19:59 +01:00
Jo
1b5f0d66fe
Fix legacy engine install issues (#1842)
* fix debug pkg clearing pkg archive slice

* add debug messages and limit check for srcinfo

* treat pkgbuildDirs as expected directory, not proven directory

Co-authored-by: David Tomaschik <david@systemoverlord.com>

Co-authored-by: David Tomaschik <david@systemoverlord.com>
2022-11-29 13:23:52 +00:00
Jo
4f1f539217
Ignore last layer fails and keep building (#1841)
* fix: do not instantiate cleaning hooks if there's no AUR pkg

* chore: squash local and sync install

* still update completions in local mode

* allow failures on the last install layer
2022-11-29 12:25:36 +00:00
Jo
7612bb5da5
Mutualize local sync install (#1836)
* fix: do not instantiate cleaning hooks if there's no AUR pkg

* chore: squash local and sync install

* still update completions in local mode
2022-11-20 13:02:01 +00:00
Jo
9f67d10d5c
feat(v12): add group install (#1835)
v12engine: add group install
2022-11-20 02:47:23 +00:00
Jo
6ad63cae10
fix: rework menus to work on both flows (#1830)
* rework menus to work on both flows

* add installed package split

* remove unused field

* Add post install hooks
2022-11-20 00:51:55 +00:00
transifex-integration[bot]
63f20599cd
Translate '/po/en.po' in 'ca' (#1833)
Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2022-11-19 19:18:49 +00:00
Jo
0f45a99efa
yay v12: new AUR query engine, Local PKGBuild install, debug mode, new install engine + bugs (#1826)
yay v12: new AUR query engine, Local PKGBuild install, debug mode, new install engine + bugs (+bugs)
2022-11-16 01:32:26 +00:00
jguer
f52db80a9d
po update 2022-11-16 02:21:35 +01:00
jguer
7f151cd603
move addUpgradeToGraph to depgraph 2022-11-16 01:11:16 +01:00
jguer
2805252365
fix GraphFromTarget bug where empty DB would be used as identifier 2022-11-16 01:03:57 +01:00
jguer
481c63db91
remove local build from -U 2022-11-16 00:56:24 +01:00
jguer
5e8d9ac846
add make to dockerfile 2022-11-16 00:54:21 +01:00
jguer
085e2c8aea
rename -Ui to -Bi 2022-11-16 00:51:57 +01:00
jguer
1153ee6dbb
support multiple targets. Remove working directory logic 2022-11-16 00:45:59 +01:00
jguer
3f5eb36fe2
add gcc to build 2022-11-16 00:29:53 +01:00
jguer
2358a7f66e
use aur package metadata client 2022-11-16 00:25:17 +01:00
jguer
2560cadcc0
update aur package 2022-11-15 19:02:35 +01:00
jguer
c00cd8d88e
rename AUR metadata client 2022-11-15 16:22:57 +01:00
jguer
f042713aaa
fix: fix linting errors 2022-11-15 15:44:50 +01:00
jguer
ae918986f4
Merge remote-tracking branch 'origin/next' into jguer/local-pkgbuild-install 2022-11-15 15:19:30 +01:00
jguer
3f7f55f260
simplify src download 2022-11-14 01:14:13 +01:00
jguer
fd46fa0f33
fix failing errors 2022-11-14 00:25:14 +01:00
jguer
d7851223c6
correct nits from metadata package and remove unneeded functionality 2022-11-14 00:21:36 +01:00
jguer
5aeb0d696c
support contains 2022-11-13 23:53:37 +01:00
jguer
56a46644cc
metadata respect search by 2022-11-13 17:47:19 +01:00
jguer
01721c816c
refactor query builder to include AURClient 2022-11-13 14:29:00 +01:00
jguer
cc8c0a2366
fix non-prefixed version 2022-11-10 01:48:58 +01:00
jguer
6cbf00c5a7
typos 2022-11-08 01:37:54 +01:00
jguer
742b6ad79c
use aur cache for upgrades 2022-11-08 01:32:21 +01:00
jguer
9b576fbab7
update dependencies 2022-11-08 01:12:13 +01:00
jguer
bfb32ea63b
update dependencies 2022-11-08 01:11:57 +01:00
jguer
a724d1554f
check hook is not nil 2022-11-03 16:23:47 +01:00
jguer
3fc5d93243
add graph feeding 2022-11-02 00:37:27 +01:00
jguer
0b3ca79788
reduce scope 2022-11-01 23:51:24 +01:00
jguer
b5bdcfbd1a
add basic sync upgrade capabilities 2022-11-01 23:48:35 +01:00
jguer
776fc9686a
Merge remote-tracking branch 'origin/next' into jguer/local-pkgbuild-install 2022-10-28 23:58:23 +02:00
jguer
d3efb59da3
extract upgrade target adder 2022-10-28 23:58:15 +02:00
jguer
849e8f7b60
restore install support for legacy 2022-10-28 01:01:03 +02:00
jguer
ba935ccf95
add support for target install 2022-10-28 00:38:11 +02:00
jguer
f496dbac8b
error refactor 2022-09-20 00:44:06 +02:00
jguer
bc4732e9e1
add debug mode 2022-09-20 00:11:10 +02:00
jguer
e4fdc9a4d4
readd makedep primitives 2022-09-20 00:01:19 +02:00
jguer
c86c460816
fix typos 2022-09-17 14:45:07 +02:00
jguer
d646cd6c87
add base to name map 2022-09-17 14:31:54 +02:00
jguer
ed94152cfe
first install 2022-09-12 00:18:38 +02:00
jguer
915799755b
allow pkgdest specification 2022-09-12 00:10:19 +02:00
jguer
351e352f64
basic aur install 2022-09-11 23:15:31 +02:00
jguer
5bb46ac1de
add repo install 2022-09-09 20:57:18 +02:00
jguer
4a73dfb0ca
topo sorted map 2022-09-09 17:38:48 +02:00
jguer
be036c39ff
wip 2022-09-08 18:04:51 +02:00
jguer
d53505be37
allow customizing colors and embedding extra info 2022-09-07 01:03:29 +02:00
jguer
0aa18d61d5
cleanup 2022-09-07 00:01:23 +02:00
jguer
95e7542ade
Merge remote-tracking branch 'origin/next' into jguer/local-pkgbuild-install 2022-09-06 23:39:08 +02:00
jguer
cadeecc4df
add dep graph for local install 2022-09-06 23:38:47 +02:00
jguer
f7286b25ae
add local graph util 2022-09-06 23:25:44 +02:00
jguer
650809eba1
wip 2022-09-04 23:45:40 +02:00
jguer
1d2d19c323
Merge remote-tracking branch 'origin/next' into jguer/local-pkgbuild-install 2022-08-23 23:29:27 +02:00
jguer
a4565b367b
wip 2022-08-23 18:33:40 +02:00
jguer
cce21ce0b6
Merge remote-tracking branch 'origin/next' into jguer/local-pkgbuild-install 2022-08-23 17:22:10 +02:00
jguer
b054828aa8
wip 2022-08-22 23:28:53 +02:00
jguer
859b7c703f
add local install handle 2022-08-21 07:15:04 +02:00
jguer
446dc86d1e
add handle upgrade 2022-08-21 07:08:01 +02:00
199 changed files with 30067 additions and 12446 deletions

26
.devcontainer/Dockerfile Normal file
View File

@ -0,0 +1,26 @@
# Use the jguer/yay-builder image as a parent image with archlinux
FROM docker.io/jguer/yay-builder
# Install extra packages (pacman-contrib, fish, git-delta, openssh, bat, go)
RUN sudo pacman -Syu --noconfirm pacman-contrib fish git-delta openssh bat go
# Set passwordless sudo for the docker user
RUN echo "docker ALL=(ALL) NOPASSWD: ALL" > /etc/sudoers.d/docker
# Switch to the non-root docker user
USER docker
# Install xgotext
RUN go install github.com/leonelquinteros/gotext/cli/xgotext@latest
# Add /app/bin to the PATH
ENV PATH="/app/bin:$PATH"
# add /home/docker/go/bin to the PATH
ENV PATH="/home/docker/go/bin:$PATH"
# Set the working directory
WORKDIR /workspace
# Command to run when starting the container
CMD ["bash"]
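For illustration, the image above can also be built and entered outside of VS Code; the tag name is arbitrary and assumes the command runs from the repository root:

```sh
docker build -f .devcontainer/Dockerfile -t yay-devcontainer .
docker run --rm -it -v "$PWD":/workspace yay-devcontainer
```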

View File

@ -0,0 +1,14 @@
{
"name": "Existing Dockerfile",
"build": {
"context": "..",
"dockerfile": "../.devcontainer/Dockerfile"
},
"customizations": {
"vscode": {
"extensions": [
"golang.go"
]
}
}
}

View File

@ -32,6 +32,11 @@ Example: `yay v8.1139.r0.g9ac4ab6 - libalpm v11.0.1` -->
Include the FULL output of any relevant commands/configs
The current yay config can be printed with `yay -Pg`
Paste services are only needed for excessive output (>500 lines)
Use --debug to add pacman and yay debug logs
or add the following key to your ~/.config/yay/config.json to only get yay debug logs
{
"debug": true
}
-->
```sh

15
.github/dependabot.yml vendored Normal file
View File

@ -0,0 +1,15 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
version: 2
updates:
- package-ecosystem: "gomod" # See documentation for possible values
directory: "/" # Location of package manifests
schedule:
interval: "weekly"
groups:
go-all:
patterns:
- '*'

View File

@ -1,40 +1,143 @@
name: Builder image
name: Builder Image
on:
schedule:
- cron: "0 3 * * 1"
- cron: "0 3 * * 1" # Every Monday at 3 AM
push:
paths:
- "ci.Dockerfile"
- "**/builder-image.yml"
- ".github/workflows/builder-image.yml"
env:
REGISTRY_IMAGE: jguer/yay-builder
jobs:
build:
name: Push builder image to Docker Hub
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
platform:
- linux/amd64
- linux/arm/v7
- linux/arm64
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: |
${{ env.REGISTRY_IMAGE }}
ghcr.io/${{ env.REGISTRY_IMAGE }}
tags: |
type=raw,value=latest
type=sha,format=long
- name: Build and push by digest
id: build
uses: docker/build-push-action@v5
with:
context: .
file: ci.Dockerfile
platforms: ${{ matrix.platform }}
labels: ${{ steps.meta.outputs.labels }}
outputs: type=image,name=${{ env.REGISTRY_IMAGE }},push-by-digest=true,name-canonical=true,push=true
- name: Export digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
echo -n "$digest" > "/tmp/digests/$(echo "${{ matrix.platform }}" | tr '/' '_')"
- name: Upload digest
uses: actions/upload-artifact@v4
with:
name: digest-${{ matrix.platform == 'linux/amd64' && 'amd64' || matrix.platform == 'linux/arm/v7' && 'armv7' || 'arm64' }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 1
merge:
needs: [build]
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Download digests
uses: actions/download-artifact@v4
with:
pattern: digest-*
merge-multiple: true
path: /tmp/digests
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to DockerHub
uses: docker/login-action@v1
uses: docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Push to Docker Hub
uses: docker/build-push-action@v2
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: |
${{ env.REGISTRY_IMAGE }}
ghcr.io/${{ env.REGISTRY_IMAGE }}
tags: |
type=raw,value=latest
type=sha,format=short
- name: Create and push manifest list
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 0
with:
platforms: linux/amd64,linux/arm/v7,linux/arm64
file: ci.Dockerfile
push: true
tags: jguer/yay-builder:latest
secrets: |
DOCKER_BUILDKIT=0
COMPOSE_DOCKER_CLI_BUILD=0
cache-from: type=registry,ref=jguer/yay-builder:latest
cache-to: type=inline
DOCKER_CLI_EXPERIMENTAL: enabled
run: |
# Extract Docker Hub tags
DH_TAGS=$(echo '${{ steps.meta.outputs.tags }}' | grep -v "^ghcr.io" | xargs -I {} echo "-t {}")
# Extract GitHub Container Registry tags
GHCR_TAGS=$(echo '${{ steps.meta.outputs.tags }}' | grep "^ghcr.io" | xargs -I {} echo "-t {}")
# Create a manifest list using the image digests from /tmp/digests/*
DIGESTS=$(for file in /tmp/digests/*; do
echo -n "${{ env.REGISTRY_IMAGE }}@$(cat $file) "
done)
# Create the manifest list for Docker Hub
docker buildx imagetools create $DH_TAGS $DIGESTS
# Create the manifest list for GitHub Container Registry
docker buildx imagetools create $GHCR_TAGS $DIGESTS
- name: Inspect image
run: |
docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:latest

View File

@ -1,4 +1,5 @@
name: Build Release
on:
push:
tags:
@ -8,35 +9,36 @@ jobs:
build-releases:
strategy:
matrix:
arch:
[
"linux/amd64 x86_64",
"linux/arm/v7 armv7h",
"linux/arm64 aarch64",
]
arch: ["linux/amd64 x86_64", "linux/arm/v7 armv7h", "linux/arm64 aarch64"]
name: Build ${{ matrix.arch }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v2
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
with:
platforms: all
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
uses: docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
version: latest
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Read info
id: tags
shell: bash
run: |
echo ::set-output name=VERSION::${GITHUB_REF/refs\/tags\/v/}
echo ::set-output name=TAG::${GITHUB_REF/refs\/tags\//}
echo "VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
echo "TAG=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
arch="${{ matrix.arch }}"
echo ::set-output name=PLATFORM::${arch%% *}
echo ::set-output name=ARCH::${arch##* }
echo "PLATFORM=${arch%% *}" >> $GITHUB_OUTPUT
echo "ARCH=${arch##* }" >> $GITHUB_OUTPUT
- name: Build ${{ matrix.arch }} release
run: |
mkdir artifacts
@ -47,75 +49,45 @@ jobs:
-t yay:${{ steps.tags.outputs.arch }} . --load
make docker-release ARCH=${{ steps.tags.outputs.arch }} VERSION=${{ steps.tags.outputs.version }} PREFIX="/usr"
mv *.tar.gz artifacts
- uses: actions/upload-artifact@v2
- uses: actions/upload-artifact@v4
with:
name: yay_${{ steps.tags.outputs.arch }}
path: artifacts
create_release:
name: Create release from this build
needs: [build-releases]
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Read info
id: tags
shell: bash
run: |
echo ::set-output name=VERSION::${GITHUB_REF/refs\/tags\/v/}
echo ::set-output name=TAG::${GITHUB_REF/refs\/tags\//}
- uses: actions/download-artifact@v2
echo "VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
echo "TAG=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
- uses: actions/download-artifact@v4
with:
name: yay_x86_64
- uses: actions/download-artifact@v2
with:
name: yay_armv7h
- uses: actions/download-artifact@v2
with:
name: yay_aarch64
pattern: yay_*
merge-multiple: true
- name: Create Release
id: create_release
uses: actions/create-release@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ steps.tags.outputs.tag }}
release_name: ${{ steps.tags.outputs.tag }}
draft: false
prerelease: false
- name: Upload x86_64 asset
id: upload-release-asset-x86_64
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yay_${{ steps.tags.outputs.version }}_x86_64.tar.gz
asset_name: yay_${{ steps.tags.outputs.version }}_x86_64.tar.gz
asset_content_type: application/tar+gzip
- name: Upload armv7h asset
id: upload-release-asset-armv7h
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yay_${{ steps.tags.outputs.version }}_armv7h.tar.gz
asset_name: yay_${{ steps.tags.outputs.version }}_armv7h.tar.gz
asset_content_type: application/tar+gzip
- name: Upload aarch64 asset
id: upload-release-asset-aarch64
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yay_${{ steps.tags.outputs.version }}_aarch64.tar.gz
asset_name: yay_${{ steps.tags.outputs.version }}_aarch64.tar.gz
asset_content_type: application/tar+gzip
run: |
gh release create ${{ steps.tags.outputs.tag }} \
--title "${{ steps.tags.outputs.tag }}" \
--generate-notes \
./yay_${{ steps.tags.outputs.version }}_*.tar.gz
- name: Release Notary Action
uses: docker://aevea/release-notary:0.9.3
uses: docker://aevea/release-notary:latest
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@ -9,27 +9,31 @@ on:
jobs:
build:
name: Lint and test yay
name: Lint and test yay (-git)
runs-on: ubuntu-latest
container:
image: jguer/yay-builder:latest
image: ghcr.io/jguer/yay-builder:latest
steps:
- name: Checkout
uses: actions/checkout@v2
- uses: actions/cache@v1
- uses: actions/checkout@v4
- uses: actions/cache@v3
with:
path: ~/go/pkg/mod
key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
restore-keys: |
${{ runner.os }}-go-
- uses: actions/cache@v3
with:
path: /home/runner/work/yay/yay/pacman-git
key: ${{ runner.os }}-pacman-${{ hashFiles('/home/runner/work/yay/yay/pacman-git/PKGBUILD') }}
restore-keys: |
${{ runner.os }}-pacman-
- name: checkout pacman-git
run: |
pacman -Sy --overwrite=* --noconfirm archlinux-keyring
pacman -Su --overwrite=* --noconfirm sudo base-devel
git clone https://aur.archlinux.org/pacman-git
git -C ./pacman-git pull || git clone https://aur.archlinux.org/pacman-git
useradd github
chmod -R 777 pacman-git
echo 'github ALL=(ALL) NOPASSWD: ALL' >> /etc/sudoers
su github -c 'cd pacman-git; yes | makepkg -si --nocheck'
chmod -R 777 pacman-git
su github -c 'cd pacman-git; yes | makepkg -i --nocheck'
- name: Run Build and Tests with pacman-git
run: make test
run: |
make test

View File

@ -1,28 +1,44 @@
name: Test against pacman
on:
pull_request:
paths-ignore:
- "doc/**"
- "**/*.po"
- "README.md"
- ".gitignore"
jobs:
build:
name: Lint and test yay
runs-on: ubuntu-latest
container:
image: jguer/yay-builder:latest
image: ghcr.io/jguer/yay-builder:latest
steps:
- name: Checkout
uses: actions/checkout@v2
- uses: actions/cache@v1
- uses: actions/checkout@v4
- uses: actions/cache@v3
with:
path: ~/go/pkg/mod
key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
restore-keys: |
${{ runner.os }}-go-
- name: Lint
run: /app/bin/golangci-lint run ./...
env:
GOFLAGS: -buildvcs=false -tags=next
run: /app/bin/golangci-lint run -v ./...
- name: Run Build and Tests
run: make test
run: make test
- name: Run Integration Tests
continue-on-error: true
run: |
useradd -m yay &&
chown -R yay:yay . &&
cp -r ~/go/ /home/yay/go/ &&
chown -R yay:yay /home/yay/go/ &&
su yay -c "make test-integration"
- name: Build yay Artifact
env:
GOFLAGS: -buildvcs=false -tags=next
run: make
- name: Upload yay Artifact
uses: actions/upload-artifact@v4
with:
name: yay
path: ./yay
if-no-files-found: error
overwrite: true

4
.gitignore vendored
View File

@ -28,3 +28,7 @@ qemu-*
*.pot
*.po~
*.pprof
node_modules/
xgotext
.devcontainer/

View File

@ -1,100 +1,94 @@
linters-settings:
dupl:
threshold: 100
funlen:
lines: 100
statements: 50
goconst:
min-len: 3
min-occurrences: 4
gocritic:
enabled-tags:
- diagnostic
- experimental
- opinionated
- performance
- style
gocyclo:
min-complexity: 15
goimports:
local-prefixes: github.com/Jguer/yay/v11
gomnd:
settings:
mnd:
# don't include the "operation" and "assign"
checks: argument,case,condition,return
govet:
check-shadowing: true
lll:
line-length: 140
maligned:
suggest-new: true
misspell:
locale: US
version: "2"
run:
go: "1.20"
linters:
# please, do not use `enable-all`: it's deprecated and will be removed soon.
# inverted configuration with `enable-all` and `disable` is not scalable during updates of golangci-lint
disable-all: true
default: none
enable:
- bodyclose
- usestdlibvars
- deadcode
- depguard
- dogsled
- dupl
- errcheck
- errorlint
- gochecknoinits
- gocritic
- gofmt
- goimports
- goprintffuncname
- gosec
- gosimple
- govet
- ineffassign
- lll
- misspell
- nakedret
- prealloc
- revive
- rowserrcheck
- noctx
- nolintlint
- staticcheck
- structcheck
- stylecheck
- typecheck
- unconvert
- unparam
- unused
- tenv
- varcheck
- whitespace
- wsl
- godot
# - maligned
# - interfacer
# - nilerr
# - nlreturn
# - exhaustivestruct
# - errname
# - forbidigo
run:
go: '1.18'
issues:
exclude-rules:
- path: _test\.go
linters:
- lll
- revive
- wsl
- govet
- godot
- errcheck
- stylecheck
- dupl
- gocritic
- gochecknoinits
exclude:
- G204
settings:
dupl:
threshold: 100
funlen:
lines: 100
statements: 50
goconst:
min-len: 3
min-occurrences: 4
gocritic:
enabled-tags:
- diagnostic
- experimental
- opinionated
- performance
- style
gocyclo:
min-complexity: 15
lll:
line-length: 140
misspell:
locale: US
nolintlint:
require-explanation: false
require-specific: false
allow-unused: false
exclusions:
generated: lax
presets:
- comments
- common-false-positives
- legacy
- std-error-handling
rules:
- linters:
- dupl
- errcheck
- errorlint
- gochecknoinits
- gocritic
- godot
- govet
- lll
- revive
- staticcheck
- wsl
path: (.+)_test.go
- path: (.+)\.go$
text: G204
paths:
- third_party$
- builtin$
- examples$
formatters:
enable:
- gofmt
- goimports
settings:
goimports:
local-prefixes:
- github.com/Jguer/yay/v12
exclusions:
generated: lax
paths:
- third_party$
- builtin$
- examples$

View File

@ -1,31 +1,29 @@
default_stages: [commit]
repos:
- repo: git://github.com/dnephin/pre-commit-golang
rev: v0.3.5
- repo: https://github.com/dnephin/pre-commit-golang
rev: v0.5.1
hooks:
- id: go-fmt
- id: go-imports
args: [-local=github.com/Jguer/yay/v11/]
- id: golangci-lint
- id: go-unit-tests
- id: go-build
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v2.2.1 # Use the sha or tag you want to point at
rev: v4.0.0-alpha.8 # Use the sha or tag you want to point at
hooks:
- id: prettier
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.4.0 # Use the ref you want to point at
rev: v4.5.0 # Use the ref you want to point at
hooks:
- id: trailing-whitespace
- id: check-json
- id: check-yaml
- id: check-added-large-files
- repo: https://github.com/Woile/commitizen
rev: v2.17.6
- repo: https://github.com/commitizen-tools/commitizen
rev: v3.15.0
hooks:
- id: commitizen
stages: [commit-msg]

7
.vscode/settings.json vendored Normal file
View File

@ -0,0 +1,7 @@
{
"go.lintTool": "golangci-lint",
"gopls": {
"formatting.gofumpt": true,
"formatting.local": "github.com/Jguer/yay/v12"
}
}

View File

@ -1,5 +1,5 @@
FROM jguer/yay-builder:latest
LABEL maintainer="Jguer,joaogg3 at google mail"
FROM ghcr.io/jguer/yay-builder:latest
LABEL maintainer="Jguer,docker@jguer.space"
ARG VERSION
ARG PREFIX
@ -9,4 +9,4 @@ WORKDIR /app
COPY . .
RUN make release VERSION=${VERSION} PREFIX=${PREFIX} ARCH=${ARCH}
RUN make release VERSION=${VERSION} PREFIX=${PREFIX} ARCH=${ARCH}

View File

@ -10,8 +10,8 @@ GO ?= go
PKGNAME := yay
PREFIX := /usr/local
MAJORVERSION := 11
MINORVERSION := 3
MAJORVERSION := 12
MINORVERSION := 0
PATCHVERSION := 0
VERSION ?= ${MAJORVERSION}.${MINORVERSION}.${PATCHVERSION}
@ -19,15 +19,14 @@ LOCALEDIR := po
SYSTEMLOCALEPATH := $(PREFIX)/share/locale/
# ls -1 po | sed -e 's/\.po$//' | paste -sd " "
LANGS := de en es eu fr_FR id it_IT ja ko pl_PL pt pt_BR ru_RU sv tr uk zh_CN zh_TW
LANGS := ca cs de en es eu fr_FR he id it_IT ja ko pl_PL pt_BR pt ru_RU ru sv tr uk zh_CN zh_TW
POTFILE := default.pot
POFILES := $(addprefix $(LOCALEDIR)/,$(addsuffix .po,$(LANGS)))
MOFILES := $(POFILES:.po=.mo)
FLAGS ?= -trimpath -mod=readonly -modcacherw
EXTRA_FLAGS ?= -buildmode=pie
LDFLAGS := -X "main.yayVersion=${VERSION}" -X "main.localePath=${SYSTEMLOCALEPATH}" -linkmode=external
FLAGS += $(shell pacman -T 'pacman-git' && echo "-tags next")
LDFLAGS := -X "main.yayVersion=${VERSION}" -X "main.localePath=${SYSTEMLOCALEPATH}" -linkmode=external -compressdwarf=false
RELEASE_DIR := ${PKGNAME}_${VERSION}_${ARCH}
PACKAGE := $(RELEASE_DIR).tar.gz
@ -53,6 +52,10 @@ test_lint: test lint
test:
$(GO) test -race -covermode=atomic $(FLAGS) ./...
.PHONY: test-integration
test-integration:
$(GO) test -tags=integration $(FLAGS) ./...
.PHONY: build
build: $(BIN)
@ -66,7 +69,7 @@ docker-release-all:
make docker-release-aarch64 ARCH=aarch64
docker-release:
docker create --name yay-$(ARCH) yay:${ARCH}
docker create --name yay-$(ARCH) yay:${ARCH} /bin/sh
docker cp yay-$(ARCH):/app/${PACKAGE} $(PACKAGE)
docker container rm yay-$(ARCH)
@ -79,9 +82,7 @@ docker-build:
.PHONY: lint
lint:
$(GO) vet $(FLAGS) ./...
@test -z "$$(gofmt -l $(SOURCES))" || (echo "Files need to be linted. Use make fmt" && false)
golangci-lint run ./...
GOFLAGS="$(FLAGS)" golangci-lint run ./...
.PHONY: fmt
fmt:
@ -122,8 +123,9 @@ $(PACKAGE): $(BIN) $(RELEASE_DIR) ${MOFILES}
locale:
xgotext -in . -out po
mv po/default.pot po/en.po
for lang in ${LANGS}; do \
test -f po/$$lang.po || msginit -l po/$$lang.po -i po/${POTFILE} -o po/$$lang.po \
test -f po/$$lang.po || msginit --no-translator -l po/$$lang.po -i po/${POTFILE} -o po/$$lang.po; \
msgmerge -U po/$$lang.po po/${POTFILE}; \
touch po/$$lang.po; \
done

128
README.md
View File

@ -19,6 +19,7 @@ Yet Another Yogurt - An AUR Helper Written in Go
- Narrow search (`yay linux header` will first search `linux` and then narrow on `header`)
- Find matching package providers during search and allow selection
- Remove make dependencies at the end of the build process
- Build local PKGBUILDs with AUR dependencies
- Un/Vote for packages
[![asciicast](https://asciinema.org/a/399431.svg)](https://asciinema.org/a/399431)
@ -29,15 +30,18 @@ Yet Another Yogurt - An AUR Helper Written in Go
If you are migrating from another AUR helper, you can simply install Yay with that helper.
> [!WARNING]
> We are using `sudo` in these examples, you can switch that out for a different privilege escalation tool.
### Source
The initial installation of Yay can be done by cloning the PKGBUILD and
building with makepkg:
Before you begin, make sure you have the `base-devel` package group installed.
We make sure we have the `base-devel` package group installed.
```sh
pacman -S --needed git base-devel
sudo pacman -S --needed git base-devel
git clone https://aur.archlinux.org/yay.git
cd yay
makepkg -si
@ -46,22 +50,27 @@ makepkg -si
If you want to do all of this at once, we can chain the commands like so:
```sh
pacman -S --needed git base-devel && git clone https://aur.archlinux.org/yay.git && cd yay && makepkg -si
sudo pacman -S --needed git base-devel && git clone https://aur.archlinux.org/yay.git && cd yay && makepkg -si
```
### Binary
If you do not want to compile yay yourself you can use the builds generated by
GitHub Actions.
```sh
pacman -S --needed git base-devel
sudo pacman -S --needed git base-devel
git clone https://aur.archlinux.org/yay-bin.git
cd yay-bin
makepkg -si
```
If you want to do all of this at once, we can chain the commands like so:
```sh
sudo pacman -S --needed git base-devel && git clone https://aur.archlinux.org/yay-bin.git && cd yay-bin && makepkg -si
```
### Other distributions
If you're using Manjaro or [another distribution that packages `yay`](https://repology.org/project/yay/versions)
@ -70,8 +79,8 @@ you can simply install yay using pacman (as root):
```sh
pacman -S --needed git base-devel yay
```
⚠️ distributions sometimes lag updating yay on their repositories.
> [!WARNING]
> distributions sometimes lag updating yay on their repositories.
## First Use
@ -88,90 +97,65 @@ pacman -S --needed git base-devel yay
## Examples of Custom Operations
| Command | Description |
| --------------------------------- | --------------------------------------------------------------------------------------------------- |
| `yay` | Alias to `yay -Syu`. |
| `yay <Search Term>` | Present package-installation selection menu. |
| `yay -Y --combinedupgrade --save` | Make combined upgrade the default mode. |
| `yay -Ps` | Print system statistics. |
| `yay -Yc` | Clean unneeded dependencies. |
| `yay -G <AUR Package>` | Download PKGBUILD from ABS or AUR. |
| `yay -Gp <AUR Package>` | Print to stdout PKGBUILD from ABS or AUR. |
| `yay -Y --gendb` | Generate development package database used for devel update. |
| `yay -Syu --devel` | Perform system upgrade, but also check for development package updates. |
| `yay -Syu --timeupdate` | Perform system upgrade and use PKGBUILD modification time (not version number) to determine update. |
| `yay -Wv <AUR Package>` | Vote for package (Requires setting `AUR_USERNAME` and `AUR_PASSWORD` environment variables). (yay v11.3+) |
| `yay -Wu <AUR Package>` | Unvote for package (Requires setting `AUR_USERNAME` and `AUR_PASSWORD` environment variables) (yay v11.3+)|
| Command | Description |
| --------------------------------- | ---------------------------------------------------------------------------------------------------------- |
| `yay` | Alias to `yay -Syu`. |
| `yay <Search Term>` | Present package-installation selection menu. |
| `yay -Bi <dir>` | Install dependencies and build a local PKGBUILD. |
| `yay -G <AUR Package>` | Download PKGBUILD from ABS or AUR. (yay v12.0+) |
| `yay -Gp <AUR Package>` | Print to stdout PKGBUILD from ABS or AUR. |
| `yay -Ps` | Print system statistics. |
| `yay -Syu --devel` | Perform system upgrade, but also check for development package updates. |
| `yay -Syu --timeupdate` | Perform system upgrade and use PKGBUILD modification time (not version number) to determine update. |
| `yay -Wu <AUR Package>` | Unvote for package (Requires setting `AUR_USERNAME` and `AUR_PASSWORD` environment variables) (yay v11.3+) |
| `yay -Wv <AUR Package>` | Vote for package (Requires setting `AUR_USERNAME` and `AUR_PASSWORD` environment variables). (yay v11.3+) |
| `yay -Y --combinedupgrade --save` | Make combined upgrade the default mode. |
| `yay -Y --gendb` | Generate development package database used for devel update. |
| `yay -Yc` | Clean unneeded dependencies. |
## Frequently Asked Questions
- **Yay does not display colored output. How do I fix it?**
Make sure you have the `Color` option in your `/etc/pacman.conf`
(see issue [#123](https://github.com/Jguer/yay/issues/123)).
- **Yay is not prompting to skip packages during system upgrade.**
The default behavior was changed after
[v8.918](https://github.com/Jguer/yay/releases/tag/v8.918)
(see [3bdb534](https://github.com/Jguer/yay/commit/3bdb5343218d99d40f8a449b887348611f6bdbfc)
and issue [#554](https://github.com/Jguer/yay/issues/554)).
To restore the package-skip behavior use `--combinedupgrade` (make
it permanent by appending `--save`). Note: skipping packages will leave your
system in a
[partially-upgraded state](https://wiki.archlinux.org/index.php/System_maintenance#Partial_upgrades_are_unsupported).
Make sure you have the `Color` option in your `/etc/pacman.conf`
(see issue [#123](https://github.com/Jguer/yay/issues/123)).
- **Sometimes diffs are printed to the terminal, and other times they are paged via less. How do I fix this?**
Yay uses `git diff` to display diffs, which by default tells less not to
page if the output can fit into one terminal length. This behavior can be
overridden by exporting your own flags (`export LESS=SRX`).
Yay uses `git diff` to display diffs, which by default tells less not to
page if the output can fit into one terminal length. This behavior can be
overridden by exporting your own flags (`export LESS=SRX`).
- **Yay is not asking me to edit PKGBUILDS, and I don't like the diff menu! What can I do?**
`yay --editmenu --nodiffmenu --save`
`yay --editmenu --diffmenu=false --save`
- **How can I tell Yay to act only on AUR packages, or only on repo packages?**
`yay -{OPERATION} --aur`
`yay -{OPERATION} --repo`
`yay -{OPERATION} --aur`
`yay -{OPERATION} --repo`
- **An `Out Of Date AUR Packages` message is displayed. Why doesn't Yay update them?**
- **A `Flagged Out Of Date AUR Packages` message is displayed. Why doesn't Yay update them?**
This message does not mean that updated AUR packages are available. It means
the packages have been flagged out of date on the AUR, but
their maintainers have not yet updated the `PKGBUILD`s
(see [outdated AUR packages](https://wiki.archlinux.org/index.php/Arch_User_Repository#Foo_in_the_AUR_is_outdated.3B_what_should_I_do.3F)).
This message does not mean that updated AUR packages are available. It means
the packages have been flagged out of date on the AUR, but
their maintainers have not yet updated the `PKGBUILD`s
(see [outdated AUR packages](https://wiki.archlinux.org/index.php/Arch_User_Repository#Foo_in_the_AUR_is_outdated.3B_what_should_I_do.3F)).
- **Yay doesn't install dependencies added to a PKGBUILD during installation.**
Yay resolves all dependencies ahead of time. You are free to edit the
PKGBUILD in any way, but any problems you cause are your own and should not be
reported unless they can be reproduced with the original PKGBUILD.
Yay resolves all dependencies ahead of time. You are free to edit the
PKGBUILD in any way, but any problems you cause are your own and should not be
reported unless they can be reproduced with the original PKGBUILD.
- **I know my `-git` package has updates but yay doesn't offer to update it**
Yay uses an hash cache for development packages. Normally it is updated at the end of the package install with the message `Found git repo`.
If you transition between aur helpers and did not install the devel package using yay at some point, it is possible it never got added to the cache. `yay -Y --gendb` will fix the current version of every devel package and start checking from there.
Yay uses a hash cache for development packages. Normally it is updated at the end of the package install with the message `Found git repo`.
If you transition between aur helpers and did not install the devel package using yay at some point, it is possible it never got added to the cache. `yay -Y --gendb` will fix the current version of every devel package and start checking from there.
- **I want to help out!**
Check [CONTRIBUTING.md](./CONTRIBUTING.md) for more information.
- **What settings do you use?**
```sh
yay -Y --devel --combinedupgrade --batchinstall --save
```
Pacman conf options:
```conf
UseSyslog
Color
CheckSpace
VerbosePkgLists
```
## Support
@ -188,14 +172,14 @@ tools.
## Images
<p float="left">
<img src="https://rawcdn.githack.com/Jguer/jguer.github.io/77647f396cb7156fd32e30970dbeaf6d6dc7f983/yay/yay.png" width="42%"/>
<img src="https://rawcdn.githack.com/Jguer/jguer.github.io/77647f396cb7156fd32e30970dbeaf6d6dc7f983/yay/yay-s.png" width="42%"/>
<p align="center">
<img src="https://raw.githubusercontent.com/Jguer/jguer.github.io/refs/heads/master/yay/yay.png" width="42%">
<img src="https://raw.githubusercontent.com/Jguer/jguer.github.io/refs/heads/master/yay/yay-s.png" width="42%">
</p>
<p float="left">
<img src="https://rawcdn.githack.com/Jguer/jguer.github.io/77647f396cb7156fd32e30970dbeaf6d6dc7f983/yay/yay-y.png" width="42%"/>
<img src="https://rawcdn.githack.com/Jguer/jguer.github.io/77647f396cb7156fd32e30970dbeaf6d6dc7f983/yay/yay-ps.png" width="42%"/>
<p align="center">
<img src="https://raw.githubusercontent.com/Jguer/jguer.github.io/refs/heads/master/yay/yay-y.png" width="42%">
<img src="https://raw.githubusercontent.com/Jguer/jguer.github.io/refs/heads/master/yay/yay-ps.png" width="42%">
</p>
### Other AUR helpers/tools

SECURITY.md Normal file

@ -0,0 +1,13 @@
# Security Policy
Thank you for helping keep yay secure!
## Supported Versions
We only provide security updates and support for the latest released version of yay. Please ensure you are using the most up-to-date version before reporting vulnerabilities.
## Reporting a Vulnerability
If you discover a security vulnerability, please email us at [security@jguer.space](mailto:security@jguer.space). We will respond as quickly as possible and coordinate a fix.
We appreciate responsible disclosure and your help in making this project safe for everyone.


@ -1,125 +0,0 @@
package main
import (
"context"
"fmt"
"path/filepath"
"runtime"
"sync"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/multierror"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
)
type ErrDownloadSource struct {
inner error
pkgName string
errOut string
}
func (e ErrDownloadSource) Error() string {
return fmt.Sprintln(gotext.Get("error downloading sources: %s", text.Cyan(e.pkgName)),
"\n\t context:", e.inner.Error(), "\n\t", e.errOut)
}
func (e *ErrDownloadSource) Unwrap() error {
return e.inner
}
func downloadPKGBUILDSource(ctx context.Context, cmdBuilder exe.ICmdBuilder, dest,
base string, incompatible stringset.StringSet) (err error) {
dir := filepath.Join(dest, base)
args := []string{"--verifysource", "-Ccf"}
if incompatible.Get(base) {
args = append(args, "--ignorearch")
}
err = cmdBuilder.Show(
cmdBuilder.BuildMakepkgCmd(ctx, dir, args...))
if err != nil {
return ErrDownloadSource{inner: err, pkgName: base, errOut: ""}
}
return nil
}
func downloadPKGBUILDSourceWorker(ctx context.Context, wg *sync.WaitGroup, dest string,
cBase <-chan string, valOut chan<- string, errOut chan<- error,
cmdBuilder exe.ICmdBuilder, incompatible stringset.StringSet) {
for base := range cBase {
err := downloadPKGBUILDSource(ctx, cmdBuilder, dest, base, incompatible)
if err != nil {
errOut <- ErrDownloadSource{inner: err, pkgName: base, errOut: ""}
} else {
valOut <- base
}
}
wg.Done()
}
func downloadPKGBUILDSourceFanout(ctx context.Context, cmdBuilder exe.ICmdBuilder, dest string,
bases []dep.Base, incompatible stringset.StringSet, maxConcurrentDownloads int) error {
if len(bases) == 1 {
return downloadPKGBUILDSource(ctx, cmdBuilder, dest, bases[0].Pkgbase(), incompatible)
}
var (
numOfWorkers = runtime.NumCPU()
wg = &sync.WaitGroup{}
c = make(chan string)
fanInChanValues = make(chan string)
fanInChanErrors = make(chan error)
)
if maxConcurrentDownloads != 0 {
numOfWorkers = maxConcurrentDownloads
}
go func() {
for _, base := range bases {
c <- base.Pkgbase()
}
close(c)
}()
// Launch Workers
wg.Add(numOfWorkers)
for s := 0; s < numOfWorkers; s++ {
go downloadPKGBUILDSourceWorker(ctx, wg, dest, c,
fanInChanValues, fanInChanErrors, cmdBuilder, incompatible)
}
go func() {
wg.Wait()
close(fanInChanValues)
close(fanInChanErrors)
}()
returnErr := multierror.MultiError{}
receiver:
for {
select {
case _, ok := <-fanInChanValues:
if !ok {
break receiver
}
case err, ok := <-fanInChanErrors:
if !ok {
break receiver
}
returnErr.Add(err)
}
}
return returnErr.Return()
}


@ -1,11 +1,15 @@
FROM docker.io/heywoodlh/archlinux:latest
FROM docker.io/ljmf00/archlinux:devel
LABEL maintainer="Jguer,docker@jguer.space"
ENV GO111MODULE=on
WORKDIR /app
RUN sed -i '/^\[community\]/,/^\[/ s/^/#/' /etc/pacman.conf
COPY go.mod .
RUN pacman -Sy && pacman -S --overwrite=* --noconfirm archlinux-keyring && pacman -Su --overwrite=* --needed --noconfirm go git && \
RUN pacman-key --init && pacman -Sy && pacman -S --overwrite=* --noconfirm archlinux-keyring && \
pacman -Su --overwrite=* --needed --noconfirm pacman doxygen meson asciidoc go git gcc make sudo base-devel && \
rm -rfv /var/cache/pacman/* /var/lib/pacman/sync/* && \
curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s v1.50.1 && \
go mod download
curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s v2.1.5 && \
go mod download

clean.go

@ -2,68 +2,75 @@ package main
import (
"context"
"fmt"
"os"
"path/filepath"
"github.com/Jguer/aur"
mapset "github.com/deckarep/golang-set/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
)
// CleanDependencies removes all dangling dependencies in system.
func cleanDependencies(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor, removeOptional bool) error {
func cleanDependencies(ctx context.Context, cfg *settings.Configuration,
cmdBuilder exe.ICmdBuilder, cmdArgs *parser.Arguments, dbExecutor db.Executor,
removeOptional bool,
) error {
hanging := hangingPackages(removeOptional, dbExecutor)
if len(hanging) != 0 {
return cleanRemove(ctx, cmdArgs, hanging)
return cleanRemove(ctx, cfg, cmdBuilder, cmdArgs, hanging)
}
return nil
}
// CleanRemove sends a full removal command to pacman with the pkgName slice.
func cleanRemove(ctx context.Context, cmdArgs *parser.Arguments, pkgNames []string) error {
func cleanRemove(ctx context.Context, cfg *settings.Configuration,
cmdBuilder exe.ICmdBuilder, cmdArgs *parser.Arguments, pkgNames []string,
) error {
if len(pkgNames) == 0 {
return nil
}
arguments := cmdArgs.CopyGlobal()
_ = arguments.AddArg("R")
if err := arguments.AddArg("R", "s", "u"); err != nil {
return err
}
arguments.AddTarget(pkgNames...)
return config.Runtime.CmdBuilder.Show(
config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
arguments, config.Runtime.Mode, settings.NoConfirm))
return cmdBuilder.Show(
cmdBuilder.BuildPacmanCmd(ctx,
arguments, cfg.Mode, settings.NoConfirm))
}
func syncClean(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
func syncClean(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
keepInstalled := false
keepCurrent := false
_, removeAll, _ := cmdArgs.GetArg("c", "clean")
for _, v := range config.Runtime.PacmanConf.CleanMethod {
if v == "KeepInstalled" {
for _, v := range run.PacmanConf.CleanMethod {
switch v {
case "KeepInstalled":
keepInstalled = true
} else if v == "KeepCurrent" {
case "KeepCurrent":
keepCurrent = true
}
}
if config.Runtime.Mode.AtLeastRepo() {
if err := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm)); err != nil {
if run.Cfg.Mode.AtLeastRepo() {
if err := run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm)); err != nil {
return err
}
}
if !config.Runtime.Mode.AtLeastAUR() {
if !run.Cfg.Mode.AtLeastAUR() {
return nil
}
@ -74,10 +81,10 @@ func syncClean(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Exe
question = gotext.Get("Do you want to remove all other AUR packages from cache?")
}
fmt.Println(gotext.Get("\nBuild directory:"), config.BuildDir)
run.Logger.Println(gotext.Get("\nBuild directory:"), run.Cfg.BuildDir)
if text.ContinueTask(os.Stdin, question, true, settings.NoConfirm) {
if err := cleanAUR(ctx, keepInstalled, keepCurrent, removeAll, dbExecutor); err != nil {
if run.Logger.ContinueTask(question, true, settings.NoConfirm) {
if err := cleanAUR(ctx, run, keepInstalled, keepCurrent, removeAll, dbExecutor); err != nil {
return err
}
}
@ -86,22 +93,24 @@ func syncClean(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Exe
return nil
}
if text.ContinueTask(os.Stdin, gotext.Get("Do you want to remove ALL untracked AUR files?"), true, settings.NoConfirm) {
return cleanUntracked(ctx)
if run.Logger.ContinueTask(gotext.Get("Do you want to remove ALL untracked AUR files?"), true, settings.NoConfirm) {
return cleanUntracked(ctx, run)
}
return nil
}
func cleanAUR(ctx context.Context, keepInstalled, keepCurrent, removeAll bool, dbExecutor db.Executor) error {
fmt.Println(gotext.Get("removing AUR packages from cache..."))
func cleanAUR(ctx context.Context, run *runtime.Runtime,
keepInstalled, keepCurrent, removeAll bool, dbExecutor db.Executor,
) error {
run.Logger.Println(gotext.Get("removing AUR packages from cache..."))
installedBases := make(stringset.StringSet)
inAURBases := make(stringset.StringSet)
installedBases := mapset.NewThreadUnsafeSet[string]()
inAURBases := mapset.NewThreadUnsafeSet[string]()
remotePackages, _ := query.GetRemotePackages(dbExecutor)
remotePackages := dbExecutor.InstalledRemotePackages()
files, err := os.ReadDir(config.BuildDir)
files, err := os.ReadDir(run.Cfg.BuildDir)
if err != nil {
return err
}
@ -121,21 +130,23 @@ func cleanAUR(ctx context.Context, keepInstalled, keepCurrent, removeAll bool, d
// Querying the AUR is slow and needs internet so don't do it if we
// don't need to.
if keepCurrent {
info, errInfo := query.AURInfo(ctx, config.Runtime.AURClient, cachedPackages, &query.AURWarnings{}, config.RequestSplitN)
info, errInfo := run.AURClient.Get(ctx, &aur.Query{
Needles: cachedPackages,
})
if errInfo != nil {
return errInfo
}
for _, pkg := range info {
inAURBases.Set(pkg.PackageBase)
for i := range info {
inAURBases.Add(info[i].PackageBase)
}
}
for _, pkg := range remotePackages {
if pkg.Base() != "" {
installedBases.Set(pkg.Base())
installedBases.Add(pkg.Base())
} else {
installedBases.Set(pkg.Name())
installedBases.Add(pkg.Name())
}
}
@ -145,28 +156,29 @@ func cleanAUR(ctx context.Context, keepInstalled, keepCurrent, removeAll bool, d
}
if !removeAll {
if keepInstalled && installedBases.Get(file.Name()) {
if keepInstalled && installedBases.Contains(file.Name()) {
continue
}
if keepCurrent && inAURBases.Get(file.Name()) {
if keepCurrent && inAURBases.Contains(file.Name()) {
continue
}
}
err = os.RemoveAll(filepath.Join(config.BuildDir, file.Name()))
if err != nil {
return nil
dir := filepath.Join(run.Cfg.BuildDir, file.Name())
run.Logger.Debugln("removing", dir)
if err = os.RemoveAll(dir); err != nil {
run.Logger.Warnln(gotext.Get("Unable to remove %s: %s", dir, err))
}
}
return nil
}
func cleanUntracked(ctx context.Context) error {
fmt.Println(gotext.Get("removing untracked AUR files from cache..."))
func cleanUntracked(ctx context.Context, run *runtime.Runtime) error {
run.Logger.Println(gotext.Get("removing untracked AUR files from cache..."))
files, err := os.ReadDir(config.BuildDir)
files, err := os.ReadDir(run.Cfg.BuildDir)
if err != nil {
return err
}
@ -176,11 +188,11 @@ func cleanUntracked(ctx context.Context) error {
continue
}
dir := filepath.Join(config.BuildDir, file.Name())
dir := filepath.Join(run.Cfg.BuildDir, file.Name())
run.Logger.Debugln("cleaning", dir)
if isGitRepository(dir) {
if err := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildGitCmd(ctx, dir, "clean", "-fx")); err != nil {
text.Warnln(gotext.Get("Unable to clean:"), dir)
if err := run.CmdBuilder.Show(run.CmdBuilder.BuildGitCmd(ctx, dir, "clean", "-fx")); err != nil {
run.Logger.Warnln(gotext.Get("Unable to clean:"), dir)
return err
}
}
@ -193,29 +205,3 @@ func isGitRepository(dir string) bool {
_, err := os.Stat(filepath.Join(dir, ".git"))
return !os.IsNotExist(err)
}
func cleanAfter(ctx context.Context, bases []dep.Base) {
fmt.Println(gotext.Get("removing untracked AUR files from cache..."))
for i, base := range bases {
dir := filepath.Join(config.BuildDir, base.Pkgbase())
if !isGitRepository(dir) {
continue
}
text.OperationInfoln(gotext.Get("Cleaning (%d/%d): %s", i+1, len(bases), text.Cyan(dir)))
_, stderr, err := config.Runtime.CmdBuilder.Capture(
config.Runtime.CmdBuilder.BuildGitCmd(
ctx, dir, "reset", "--hard", "HEAD"))
if err != nil {
text.Errorln(gotext.Get("error resetting %s: %s", base.String(), stderr))
}
if err := config.Runtime.CmdBuilder.Show(
config.Runtime.CmdBuilder.BuildGitCmd(
ctx, dir, "clean", "-fx", "--exclude='*.pkg.*'")); err != nil {
fmt.Fprintln(os.Stderr, err)
}
}
}

clean_test.go Normal file

@ -0,0 +1,116 @@
//go:build !integration
// +build !integration
package main
import (
"context"
"fmt"
"os/exec"
"strings"
"testing"
"github.com/Jguer/go-alpm/v2"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/Jguer/yay/v12/pkg/db/mock"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
)
func TestCleanHanging(t *testing.T) {
pacmanBin := t.TempDir() + "/pacman"
t.Parallel()
testCases := []struct {
name string
args []string
wantShow []string
}{
{
name: "clean",
args: []string{"Y", "c"},
wantShow: []string{"pacman", "-R", "-s", "-u", "--config", "/etc/pacman.conf", "--", "lsp-plugins"},
},
{
name: "clean double",
args: []string{"Y", "c", "c"},
wantShow: []string{"pacman", "-R", "-s", "-u", "--config", "/etc/pacman.conf", "--", "lsp-plugins", "linux-headers"},
},
}
dbExc := &mock.DBExecutor{
PackageOptionalDependsFn: func(i alpm.IPackage) []alpm.Depend {
if i.Name() == "linux" {
return []alpm.Depend{
{
Name: "linux-headers",
},
}
}
return []alpm.Depend{}
},
PackageProvidesFn: func(p alpm.IPackage) []alpm.Depend { return []alpm.Depend{} },
PackageDependsFn: func(p alpm.IPackage) []alpm.Depend { return []alpm.Depend{} },
LocalPackagesFn: func() []mock.IPackage {
return []mock.IPackage{
&mock.Package{
PReason: alpm.PkgReasonExplicit,
PName: "linux",
},
&mock.Package{
PReason: alpm.PkgReasonDepend,
PName: "lsp-plugins",
},
&mock.Package{
PReason: alpm.PkgReasonDepend,
PName: "linux-headers",
},
}
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
mockRunner := &exe.MockRunner{
CaptureFn: func(cmd *exec.Cmd) (stdout string, stderr string, err error) {
return "", "", nil
},
ShowFn: func(cmd *exec.Cmd) error { return nil },
}
cmdBuilder := &exe.CmdBuilder{
SudoBin: "su",
PacmanBin: pacmanBin,
PacmanConfigPath: "/etc/pacman.conf",
GitBin: "git",
Runner: mockRunner,
SudoLoopEnabled: false,
}
run := &runtime.Runtime{CmdBuilder: cmdBuilder, Cfg: &settings.Configuration{}}
cmdArgs := parser.MakeArguments()
cmdArgs.AddArg(tc.args...)
err := handleCmd(context.Background(),
run, cmdArgs, dbExc,
)
require.NoError(t, err)
for i, call := range mockRunner.ShowCalls {
show := call.Args[0].(*exec.Cmd).String()
show = strings.ReplaceAll(show, pacmanBin, "pacman")
// options are in a different order on different systems and on CI root user is used
assert.Subset(t, strings.Split(show, " "),
strings.Split(tc.wantShow[i], " "),
fmt.Sprintf("%d - %s", i, show))
}
})
}
}

cmd.go

@ -6,27 +6,28 @@ import (
"errors"
"fmt"
"net/http"
"os"
"strings"
alpm "github.com/Jguer/go-alpm/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/completion"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/download"
"github.com/Jguer/yay/v11/pkg/intrange"
"github.com/Jguer/yay/v11/pkg/news"
"github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v11/pkg/upgrade"
"github.com/Jguer/yay/v11/pkg/vcs"
"github.com/Jguer/yay/v12/pkg/completion"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/download"
"github.com/Jguer/yay/v12/pkg/intrange"
"github.com/Jguer/yay/v12/pkg/news"
"github.com/Jguer/yay/v12/pkg/query"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/Jguer/yay/v12/pkg/upgrade"
"github.com/Jguer/yay/v12/pkg/vcs"
)
func usage() {
fmt.Println(`Usage:
func usage(logger *text.Logger) {
logger.Println(`Usage:
yay
yay <operation> [...]
yay <package(s)>
@ -43,15 +44,17 @@ operations:
yay {-U --upgrade} [options] <file(s)>
New operations:
yay {-Y --yay} [options] [package(s)]
yay {-P --show} [options]
yay {-B --build} [options] [dir]
yay {-G --getpkgbuild} [options] [package(s)]
yay {-P --show} [options]
yay {-W --web} [options] [package(s)]
yay {-Y --yay} [options] [package(s)]
If no arguments are provided 'yay -Syu' will be performed
If no operation is provided -Y will be assumed
If no operation is specified 'yay -Syu' will be performed
If no operation is specified and targets are provided -Y will be assumed
New options:
--repo Assume targets are from the repositories
-N --repo Assume targets are from the repositories
-a --aur Assume targets are from the AUR
Permanent configuration options:
@ -89,24 +92,19 @@ Permanent configuration options:
--cleanmenu Give the option to clean build PKGBUILDS
--diffmenu Give the option to show diffs for build files
--editmenu Give the option to edit/view PKGBUILDS
--upgrademenu Show a detailed list of updates with the option to skip any
--nocleanmenu Don't clean build PKGBUILDS
--nodiffmenu Don't show diffs for build files
--noeditmenu Don't edit/view PKGBUILDS
--noupgrademenu Don't show the upgrade menu
--askremovemake Ask to remove makedepends after install
--askyesremovemake Ask to remove makedepends after install("Y" as default)
--removemake Remove makedepends after install
--noremovemake Don't remove makedepends after install
--cleanafter Remove package sources after successful install
--nocleanafter Do not remove package sources after successful build
--keepsrc Keep pkg/ and src/ after building packages
--bottomup Shows AUR's packages first and then repository's
--topdown Shows repository's packages first and then AUR's
--singlelineresults List each search result on its own line
--doublelineresults List each search result on two lines, like pacman
--devel Check development packages during sysupgrade
--nodevel Do not check development packages
--rebuild Always build target packages
--rebuildall Always build all AUR packages
--norebuild Skip package build if in cache and up to date
@ -115,23 +113,14 @@ Permanent configuration options:
--noredownload Skip pkgbuild download if in cache and up to date
--redownloadall Always download pkgbuilds of all AUR packages
--provides Look for matching providers when searching for packages
--noprovides Just look for packages by pkgname
--pgpfetch Prompt to import PGP keys from PKGBUILDs
--nopgpfetch Don't prompt to import PGP keys
--useask Automatically resolve conflicts using pacman's ask flag
--nouseask Confirm conflicts manually during the install
--combinedupgrade Refresh then perform the repo and AUR upgrade together
--nocombinedupgrade Perform the repo upgrade and AUR upgrade separately
--batchinstall Build multiple AUR packages then install them together
--nobatchinstall Build and install each AUR package one by one
--sudo <file> sudo command to use
--sudoflags <flags> Pass arguments to sudo
--sudoloop Loop sudo calls in the background to avoid timeout
--nosudoloop Do not loop sudo calls in the background
--timeupdate Check packages' AUR page for changes during sysupgrade
--notimeupdate Do not check packages' AUR page for changes
show specific options:
-c --complete Used for completions
@ -141,7 +130,7 @@ show specific options:
-w --news Print arch news
yay specific options:
-c --clean Remove unneeded dependencies
-c --clean Remove unneeded dependencies (-cc to ignore optdepends)
--gendb Generates development package DB used for updating
getpkgbuild specific options:
@ -149,46 +138,49 @@ getpkgbuild specific options:
-p --print Print pkgbuild of packages`)
}
func handleCmd(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
func handleCmd(ctx context.Context, run *runtime.Runtime,
cmdArgs *parser.Arguments, dbExecutor db.Executor,
) error {
if cmdArgs.ExistsArg("h", "help") {
return handleHelp(ctx, cmdArgs)
return handleHelp(ctx, run, cmdArgs)
}
if config.SudoLoop && cmdArgs.NeedRoot(config.Runtime.Mode) {
config.Runtime.CmdBuilder.SudoLoop()
if run.Cfg.SudoLoop && cmdArgs.NeedRoot(run.Cfg.Mode) {
run.CmdBuilder.SudoLoop()
}
switch cmdArgs.Op {
case "V", "version":
handleVersion()
handleVersion(run.Logger)
return nil
case "D", "database":
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
case "F", "files":
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
case "Q", "query":
return handleQuery(ctx, cmdArgs, dbExecutor)
return handleQuery(ctx, run, cmdArgs, dbExecutor)
case "R", "remove":
return handleRemove(ctx, cmdArgs, config.Runtime.VCSStore)
return handleRemove(ctx, run, cmdArgs, run.VCSStore)
case "S", "sync":
return handleSync(ctx, cmdArgs, dbExecutor)
return handleSync(ctx, run, cmdArgs, dbExecutor)
case "T", "deptest":
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
case "U", "upgrade":
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return handleUpgrade(ctx, run, cmdArgs)
case "B", "build":
return handleBuild(ctx, run, dbExecutor, cmdArgs)
case "G", "getpkgbuild":
return handleGetpkgbuild(ctx, cmdArgs, dbExecutor)
return handleGetpkgbuild(ctx, run, cmdArgs, dbExecutor)
case "P", "show":
return handlePrint(ctx, cmdArgs, dbExecutor)
return handlePrint(ctx, run, cmdArgs, dbExecutor)
case "Y", "yay":
return handleYay(ctx, cmdArgs, dbExecutor, config.Runtime.QueryBuilder)
return handleYay(ctx, run, cmdArgs, run.CmdBuilder,
dbExecutor, run.QueryBuilder)
case "W", "web":
return handleWeb(ctx, cmdArgs)
return handleWeb(ctx, run, cmdArgs)
}
return errors.New(gotext.Get("unhandled operation"))
@ -204,32 +196,33 @@ func getFilter(cmdArgs *parser.Arguments) (upgrade.Filter, error) {
case deps && explicit:
return nil, errors.New(gotext.Get("invalid option: '--deps' and '--explicit' may not be used together"))
case deps:
return func(pkg upgrade.Upgrade) bool {
return func(pkg *upgrade.Upgrade) bool {
return pkg.Reason == alpm.PkgReasonDepend
}, nil
case explicit:
return func(pkg upgrade.Upgrade) bool {
return func(pkg *upgrade.Upgrade) bool {
return pkg.Reason == alpm.PkgReasonExplicit
}, nil
}
return func(pkg upgrade.Upgrade) bool {
return func(pkg *upgrade.Upgrade) bool {
return true
}, nil
}
func handleQuery(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
func handleQuery(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
if cmdArgs.ExistsArg("u", "upgrades") {
filter, err := getFilter(cmdArgs)
if err != nil {
return err
}
return printUpdateList(ctx, cmdArgs, dbExecutor, cmdArgs.ExistsDouble("u", "sysupgrade"), filter)
return printUpdateList(ctx, run, cmdArgs, dbExecutor,
cmdArgs.ExistsDouble("u", "sysupgrade"), filter)
}
if err := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm)); err != nil {
if err := run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm)); err != nil {
if str := err.Error(); strings.Contains(str, "exit status") {
// yay -Qdt should not output anything in case of error
return fmt.Errorf("")
@ -241,138 +234,153 @@ func handleQuery(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.E
return nil
}
func handleHelp(ctx context.Context, cmdArgs *parser.Arguments) error {
func handleHelp(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments) error {
usage(run.Logger)
switch cmdArgs.Op {
case "Y", "yay", "G", "getpkgbuild", "P", "show":
usage()
case "Y", "yay", "G", "getpkgbuild", "P", "show", "W", "web", "B", "build":
return nil
}
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
run.Logger.Println("\npacman operation specific options:")
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
}
func handleVersion() {
fmt.Printf("yay v%s - libalpm v%s\n", yayVersion, alpm.Version())
func handleVersion(logger *text.Logger) {
logger.Printf("yay v%s - libalpm v%s\n", yayVersion, alpm.Version())
}
func handlePrint(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
func handlePrint(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
switch {
case cmdArgs.ExistsArg("d", "defaultconfig"):
tmpConfig := settings.DefaultConfig(yayVersion)
fmt.Printf("%v", tmpConfig)
run.Logger.Printf("%v", tmpConfig)
return nil
case cmdArgs.ExistsArg("g", "currentconfig"):
fmt.Printf("%v", config)
run.Logger.Printf("%v", run.Cfg)
return nil
case cmdArgs.ExistsArg("n", "numberupgrades"):
filter, err := getFilter(cmdArgs)
if err != nil {
return err
}
return printNumberOfUpdates(ctx, dbExecutor, cmdArgs.ExistsDouble("u", "sysupgrade"), filter)
case cmdArgs.ExistsArg("w", "news"):
double := cmdArgs.ExistsDouble("w", "news")
quiet := cmdArgs.ExistsArg("q", "quiet")
return news.PrintNewsFeed(ctx, config.Runtime.HTTPClient, dbExecutor.LastBuildTime(), config.BottomUp, double, quiet)
case cmdArgs.ExistsDouble("c", "complete"):
return completion.Show(ctx, config.Runtime.HTTPClient, dbExecutor,
config.AURURL, config.Runtime.CompletionPath, config.CompletionInterval, true)
return news.PrintNewsFeed(ctx, run.HTTPClient, run.Logger,
dbExecutor.LastBuildTime(), run.Cfg.BottomUp, double, quiet)
case cmdArgs.ExistsArg("c", "complete"):
return completion.Show(ctx, config.Runtime.HTTPClient, dbExecutor,
config.AURURL, config.Runtime.CompletionPath, config.CompletionInterval, false)
return completion.Show(ctx, run.HTTPClient, dbExecutor,
run.Cfg.AURURL, run.Cfg.CompletionPath, run.Cfg.CompletionInterval, cmdArgs.ExistsDouble("c", "complete"))
case cmdArgs.ExistsArg("s", "stats"):
return localStatistics(ctx, dbExecutor)
return localStatistics(ctx, run, dbExecutor)
}
return nil
}
func handleYay(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor, queryBuilder query.Builder) error {
func handleYay(ctx context.Context, run *runtime.Runtime,
cmdArgs *parser.Arguments, cmdBuilder exe.ICmdBuilder,
dbExecutor db.Executor, queryBuilder query.Builder,
) error {
switch {
case cmdArgs.ExistsArg("gendb"):
return createDevelDB(ctx, config, dbExecutor)
return createDevelDB(ctx, run, dbExecutor)
case cmdArgs.ExistsDouble("c"):
return cleanDependencies(ctx, cmdArgs, dbExecutor, true)
return cleanDependencies(ctx, run.Cfg, cmdBuilder, cmdArgs, dbExecutor, true)
case cmdArgs.ExistsArg("c", "clean"):
return cleanDependencies(ctx, cmdArgs, dbExecutor, false)
return cleanDependencies(ctx, run.Cfg, cmdBuilder, cmdArgs, dbExecutor, false)
case len(cmdArgs.Targets) > 0:
return displayNumberMenu(ctx, cmdArgs.Targets, dbExecutor, queryBuilder, cmdArgs)
return displayNumberMenu(ctx, run, cmdArgs.Targets, dbExecutor, queryBuilder, cmdArgs)
}
return nil
}
func handleWeb(ctx context.Context, cmdArgs *parser.Arguments) error {
func handleWeb(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments) error {
switch {
case cmdArgs.ExistsArg("v", "vote"):
return handlePackageVote(ctx, cmdArgs.Targets, config.Runtime.AURClient,
config.Runtime.VoteClient, config.RequestSplitN, true)
return handlePackageVote(ctx, cmdArgs.Targets, run.AURClient, run.Logger,
run.VoteClient, true)
case cmdArgs.ExistsArg("u", "unvote"):
return handlePackageVote(ctx, cmdArgs.Targets, config.Runtime.AURClient,
config.Runtime.VoteClient, config.RequestSplitN, false)
return handlePackageVote(ctx, cmdArgs.Targets, run.AURClient, run.Logger,
run.VoteClient, false)
}
return nil
}
func handleGetpkgbuild(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor download.DBSearcher) error {
func handleGetpkgbuild(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments, dbExecutor download.DBSearcher) error {
if cmdArgs.ExistsArg("p", "print") {
return printPkgbuilds(dbExecutor, config.Runtime.HTTPClient, cmdArgs.Targets, config.Runtime.Mode, config.AURURL)
return printPkgbuilds(dbExecutor, run.AURClient,
run.HTTPClient, run.Logger, cmdArgs.Targets, run.Cfg.Mode, run.Cfg.AURURL)
}
return getPkgbuilds(ctx, dbExecutor, config, cmdArgs.Targets, cmdArgs.ExistsArg("f", "force"))
return getPkgbuilds(ctx, dbExecutor, run.AURClient, run,
cmdArgs.Targets, cmdArgs.ExistsArg("f", "force"))
}
func handleSync(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
func handleUpgrade(ctx context.Context,
run *runtime.Runtime, cmdArgs *parser.Arguments,
) error {
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
}
// -B* options
func handleBuild(ctx context.Context,
run *runtime.Runtime, dbExecutor db.Executor, cmdArgs *parser.Arguments,
) error {
if cmdArgs.ExistsArg("i", "install") {
return installLocalPKGBUILD(ctx, run, cmdArgs, dbExecutor)
}
return nil
}
func handleSync(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
targets := cmdArgs.Targets
switch {
case cmdArgs.ExistsArg("s", "search"):
return syncSearch(ctx, targets, config.Runtime.AURClient, dbExecutor, config.Runtime.QueryBuilder, !cmdArgs.ExistsArg("q", "quiet"))
return syncSearch(ctx, targets, dbExecutor, run.QueryBuilder, !cmdArgs.ExistsArg("q", "quiet"))
case cmdArgs.ExistsArg("p", "print", "print-format"):
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
case cmdArgs.ExistsArg("c", "clean"):
return syncClean(ctx, cmdArgs, dbExecutor)
return syncClean(ctx, run, cmdArgs, dbExecutor)
case cmdArgs.ExistsArg("l", "list"):
return syncList(ctx, config.Runtime.HTTPClient, cmdArgs, dbExecutor)
return syncList(ctx, run, run.HTTPClient, cmdArgs, dbExecutor)
case cmdArgs.ExistsArg("g", "groups"):
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
case cmdArgs.ExistsArg("i", "info"):
return syncInfo(ctx, cmdArgs, targets, dbExecutor)
return syncInfo(ctx, run, cmdArgs, targets, dbExecutor)
case cmdArgs.ExistsArg("u", "sysupgrade") || len(cmdArgs.Targets) > 0:
return install(ctx, cmdArgs, dbExecutor, false)
return syncInstall(ctx, run, cmdArgs, dbExecutor)
case cmdArgs.ExistsArg("y", "refresh"):
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
}
return nil
}
func handleRemove(ctx context.Context, cmdArgs *parser.Arguments, localCache *vcs.InfoStore) error {
err := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
func handleRemove(ctx context.Context, run *runtime.Runtime, cmdArgs *parser.Arguments, localCache vcs.Store) error {
err := run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
if err == nil {
localCache.RemovePackage(cmdArgs.Targets)
localCache.RemovePackages(cmdArgs.Targets)
}
return err
}
// NumberMenu presents a CLI for selecting packages to install.
func displayNumberMenu(ctx context.Context, pkgS []string, dbExecutor db.Executor,
func displayNumberMenu(ctx context.Context, run *runtime.Runtime, pkgS []string, dbExecutor db.Executor,
queryBuilder query.Builder, cmdArgs *parser.Arguments,
) error {
queryBuilder.Execute(ctx, dbExecutor, config.Runtime.AURClient, pkgS)
queryBuilder.Execute(ctx, dbExecutor, pkgS)
if err := queryBuilder.Results(os.Stdout, dbExecutor, query.NumberMenu); err != nil {
if err := queryBuilder.Results(dbExecutor, query.NumberMenu); err != nil {
return err
}
@ -381,9 +389,9 @@ func displayNumberMenu(ctx context.Context, pkgS []string, dbExecutor db.Executo
return nil
}
text.Infoln(gotext.Get("Packages to install (eg: 1 2 3, 1-3 or ^4)"))
run.Logger.Infoln(gotext.Get("Packages to install (eg: 1 2 3, 1-3 or ^4)"))
numberBuf, err := text.GetInput("", false)
numberBuf, err := run.Logger.GetInput("", false)
if err != nil {
return err
}
@ -395,29 +403,31 @@ func displayNumberMenu(ctx context.Context, pkgS []string, dbExecutor db.Executo
return err
}
arguments := cmdArgs.CopyGlobal()
arguments.AddTarget(targets...)
// modify the arguments to pass for the install
cmdArgs.Targets = targets
if len(arguments.Targets) == 0 {
fmt.Println(gotext.Get(" there is nothing to do"))
if len(cmdArgs.Targets) == 0 {
run.Logger.Println(gotext.Get(" there is nothing to do"))
return nil
}
return install(ctx, arguments, dbExecutor, true)
return syncInstall(ctx, run, cmdArgs, dbExecutor)
}
func syncList(ctx context.Context, httpClient *http.Client, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
func syncList(ctx context.Context, run *runtime.Runtime,
httpClient *http.Client, cmdArgs *parser.Arguments, dbExecutor db.Executor,
) error {
aur := false
for i := len(cmdArgs.Targets) - 1; i >= 0; i-- {
if cmdArgs.Targets[i] == "aur" && config.Runtime.Mode.AtLeastAUR() {
if cmdArgs.Targets[i] == "aur" && run.Cfg.Mode.AtLeastAUR() {
cmdArgs.Targets = append(cmdArgs.Targets[:i], cmdArgs.Targets[i+1:]...)
aur = true
}
}
if config.Runtime.Mode.AtLeastAUR() && (len(cmdArgs.Targets) == 0 || aur) {
req, err := http.NewRequestWithContext(ctx, http.MethodGet, config.AURURL+"/packages.gz", http.NoBody)
if run.Cfg.Mode.AtLeastAUR() && (len(cmdArgs.Targets) == 0 || aur) {
req, err := http.NewRequestWithContext(ctx, http.MethodGet, run.Cfg.AURURL+"/packages.gz", http.NoBody)
if err != nil {
return err
}
@ -435,22 +445,22 @@ func syncList(ctx context.Context, httpClient *http.Client, cmdArgs *parser.Argu
for scanner.Scan() {
name := scanner.Text()
if cmdArgs.ExistsArg("q", "quiet") {
fmt.Println(name)
run.Logger.Println(name)
} else {
fmt.Printf("%s %s %s", text.Magenta("aur"), text.Bold(name), text.Bold(text.Green(gotext.Get("unknown-version"))))
run.Logger.Printf("%s %s %s", text.Magenta("aur"), text.Bold(name), text.Bold(text.Green(gotext.Get("unknown-version"))))
if dbExecutor.LocalPackage(name) != nil {
fmt.Print(text.Bold(text.Blue(gotext.Get(" [Installed]"))))
run.Logger.Print(text.Bold(text.Blue(gotext.Get(" [Installed]"))))
}
fmt.Println()
run.Logger.Println()
}
}
}
if config.Runtime.Mode.AtLeastRepo() && (len(cmdArgs.Targets) != 0 || !aur) {
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
if run.Cfg.Mode.AtLeastRepo() && (len(cmdArgs.Targets) != 0 || !aur) {
return run.CmdBuilder.Show(run.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, run.Cfg.Mode, settings.NoConfirm))
}
return nil

cmd_test.go Normal file

@ -0,0 +1,140 @@
//go:build !integration
// +build !integration
package main
import (
"context"
"fmt"
"io"
"os"
"os/exec"
"strings"
"testing"
"github.com/Jguer/aur"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/Jguer/yay/v12/pkg/db/mock"
mockaur "github.com/Jguer/yay/v12/pkg/dep/mock"
"github.com/Jguer/yay/v12/pkg/query"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/Jguer/yay/v12/pkg/vcs"
)
func TestYogurtMenuAURDB(t *testing.T) {
t.Skip("skip until Operation service is an interface")
t.Parallel()
makepkgBin := t.TempDir() + "/makepkg"
pacmanBin := t.TempDir() + "/pacman"
gitBin := t.TempDir() + "/git"
f, err := os.OpenFile(makepkgBin, os.O_RDONLY|os.O_CREATE, 0o755)
require.NoError(t, err)
require.NoError(t, f.Close())
f, err = os.OpenFile(pacmanBin, os.O_RDONLY|os.O_CREATE, 0o755)
require.NoError(t, err)
require.NoError(t, f.Close())
f, err = os.OpenFile(gitBin, os.O_RDONLY|os.O_CREATE, 0o755)
require.NoError(t, err)
require.NoError(t, f.Close())
captureOverride := func(cmd *exec.Cmd) (stdout string, stderr string, err error) {
return "", "", nil
}
showOverride := func(cmd *exec.Cmd) error {
return nil
}
mockRunner := &exe.MockRunner{CaptureFn: captureOverride, ShowFn: showOverride}
cmdBuilder := &exe.CmdBuilder{
MakepkgBin: makepkgBin,
SudoBin: "su",
PacmanBin: pacmanBin,
PacmanConfigPath: "/etc/pacman.conf",
GitBin: "git",
Runner: mockRunner,
SudoLoopEnabled: false,
}
cmdArgs := parser.MakeArguments()
cmdArgs.AddArg("Y")
cmdArgs.AddTarget("yay")
db := &mock.DBExecutor{
AlpmArchitecturesFn: func() ([]string, error) {
return []string{"x86_64"}, nil
},
RefreshHandleFn: func() error {
return nil
},
ReposFn: func() []string {
return []string{"aur"}
},
SyncPackagesFn: func(s ...string) []mock.IPackage {
return []mock.IPackage{
&mock.Package{
PName: "yay",
PBase: "yay",
PVersion: "10.0.0",
PDB: mock.NewDB("aur"),
},
}
},
LocalPackageFn: func(s string) mock.IPackage {
return nil
},
}
aurCache := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{
{
Name: "yay",
PackageBase: "yay",
Version: "10.0.0",
},
}, nil
},
}
logger := text.NewLogger(io.Discard, os.Stderr, strings.NewReader("1\n"), true, "test")
run := &runtime.Runtime{
Cfg: &settings.Configuration{
RemoveMake: "no",
},
Logger: logger,
CmdBuilder: cmdBuilder,
VCSStore: &vcs.Mock{},
QueryBuilder: query.NewSourceQueryBuilder(aurCache, logger, "votes", parser.ModeAny, "name",
true, false, true),
AURClient: aurCache,
}
err = handleCmd(context.Background(), run, cmdArgs, db)
require.NoError(t, err)
wantCapture := []string{}
wantShow := []string{
"pacman -S -y --config /etc/pacman.conf --",
"pacman -S -y -u --config /etc/pacman.conf --",
}
require.Len(t, mockRunner.ShowCalls, len(wantShow))
require.Len(t, mockRunner.CaptureCalls, len(wantCapture))
for i, call := range mockRunner.ShowCalls {
show := call.Args[0].(*exec.Cmd).String()
show = strings.ReplaceAll(show, makepkgBin, "makepkg")
show = strings.ReplaceAll(show, pacmanBin, "pacman")
show = strings.ReplaceAll(show, gitBin, "pacman")
// options are in a different order on different systems and on CI root user is used
assert.Subset(t, strings.Split(show, " "), strings.Split(wantShow[i], " "), fmt.Sprintf("%d - %s", i, show))
}
}


@ -61,22 +61,22 @@ _yay() {
search unrequired upgrades' 'c e g i k l m n o p s t u')
remove=('cascade dbonly nodeps assume-installed nosave print recursive unneeded' 'c n p s u')
sync=('asdeps asexplicit clean dbonly downloadonly overwrite groups ignore ignoregroup
info list needed nodeps assume-installed print refresh recursive search sysupgrade'
'c g i l p s u w y')
info list needed nodeps assume-installed print refresh recursive search sysupgrade aur repo'
'c g i l p s u w y a N')
upgrade=('asdeps asexplicit overwrite needed nodeps assume-installed print recursive' 'p')
core=('database files help query remove sync upgrade version' 'D F Q R S U V h')
##yay stuff
common=('arch cachedir color config confirm dbpath debug gpgdir help hookdir logfile
noconfirm noprogressbar noscriptlet quiet root verbose
makepkg pacman git gpg gpgflags config requestsplitn sudoloop nosudoloop
makepkg pacman git gpg gpgflags config requestsplitn sudoloop
redownload noredownload redownloadall rebuild rebuildall rebuildtree norebuild sortby
singlelineresults doublelineresults answerclean answerdiff answeredit answerupgrade noanswerclean noanswerdiff
noansweredit noanswerupgrade cleanmenu diffmenu editmenu upgrademenu cleanafter nocleanafter
nocleanmenu nodiffmenu noupgrademenu provides noprovides pgpfetch nopgpfetch
useask nouseask combinedupgrade nocombinedupgrade aur repo makepkgconf
nomakepkgconf askremovemake removemake noremovemake completioninterval aururl aurrpcurl
searchby batchinstall nobatchinstall'
noansweredit noanswerupgrade cleanmenu diffmenu editmenu cleanafter keepsrc
provides pgpfetch
useask combinedupgrade aur repo makepkgconf
nomakepkgconf askremovemake askyesremovemake removemake noremovemake completioninterval aururl aurrpcurl
searchby batchinstall'
'b d h q r v')
yays=('clean gendb' 'c')
show=('complete defaultconfig currentconfig stats news' 'c d g s w')


@ -165,8 +165,8 @@ complete -c $progname -n "$webspecific" -s u -l unvote -d 'Unvote for AUR packag
complete -c $progname -n "$webspecific" -xa "$listall"
# New options
complete -c $progname -n "not $noopt" -l repo -d 'Assume targets are from the AUR' -f
complete -c $progname -n "not $noopt" -s a -l aur -d 'Assume targets are from the repositories' -f
complete -c $progname -n "not $noopt" -s a -l aur -d 'Assume targets are from the AUR' -f
complete -c $progname -n "not $noopt" -s N -l repo -d 'Assume targets are from the repositories' -f
# Yay options
complete -c $progname -n "$yayspecific" -s c -l clean -d 'Remove unneeded dependencies' -f
@ -216,12 +216,8 @@ complete -c $progname -n "not $noopt" -l noanswerupgrade -d 'Unset the answer fo
complete -c $progname -n "not $noopt" -l cleanmenu -d 'Give the option to clean build PKGBUILDS' -f
complete -c $progname -n "not $noopt" -l diffmenu -d 'Give the option to show diffs for build files' -f
complete -c $progname -n "not $noopt" -l editmenu -d 'Give the option to edit/view PKGBUILDS' -f
complete -c $progname -n "not $noopt" -l upgrademenu -d 'Show a detailed list of updates with the option to skip any' -f
complete -c $progname -n "not $noopt" -l nocleanmenu -d 'Do not clean build PKGBUILDS' -f
complete -c $progname -n "not $noopt" -l nodiffmenu -d 'Do not show diffs for build files' -f
complete -c $progname -n "not $noopt" -l noeditmenu -d 'Do not edit/view PKGBUILDS' -f
complete -c $progname -n "not $noopt" -l noupgrademenu -d 'Do not show the upgrade menu' -f
complete -c $progname -n "not $noopt" -l askremovemake -d 'Ask to remove make deps after install' -f
complete -c $progname -n "not $noopt" -l askyesremovemake -d 'Ask to remove make deps after install(with "Y" as default)' -f
complete -c $progname -n "not $noopt" -l removemake -d 'Remove make deps after install' -f
complete -c $progname -n "not $noopt" -l noremovemake -d 'Do not remove make deps after install' -f
complete -c $progname -n "not $noopt" -l topdown -d 'Shows repository packages first and then aur' -f
@ -229,24 +225,17 @@ complete -c $progname -n "not $noopt" -l bottomup -d 'Shows aur packages first a
complete -c $progname -n "not $noopt" -l singlelineresults -d 'List each search result on its own line' -f
complete -c $progname -n "not $noopt" -l doublelineresults -d 'List each search result on two lines, like pacman' -f
complete -c $progname -n "not $noopt" -l devel -d 'Check -git/-svn/-hg development version' -f
complete -c $progname -n "not $noopt" -l nodevel -d 'Disable development version checking' -f
complete -c $progname -n "not $noopt" -l cleanafter -d 'Clean package sources after successful build' -f
complete -c $progname -n "not $noopt" -l nocleanafter -d 'Disable package sources cleaning' -f
complete -c $progname -n "not $noopt" -l keepsrc -d 'Keep pkg/ and src/ after building packages' -f
complete -c $progname -n "not $noopt" -l timeupdate -d 'Check package modification date and version' -f
complete -c $progname -n "not $noopt" -l notimeupdate -d 'Check only package version change' -f
complete -c $progname -n "not $noopt" -l redownload -d 'Redownload PKGBUILD of package even if up-to-date' -f
complete -c $progname -n "not $noopt" -l redownloadall -d 'Redownload PKGBUILD of package and deps even if up-to-date' -f
complete -c $progname -n "not $noopt" -l noredownload -d 'Do not redownload up-to-date PKGBUILDs' -f
complete -c $progname -n "not $noopt" -l provides -d 'Look for matching providers when searching for packages' -f
complete -c $progname -n "not $noopt" -l noprovides -d 'Just look for packages by pkgname' -f
complete -c $progname -n "not $noopt" -l pgpfetch -d 'Prompt to import PGP keys from PKGBUILDs' -f
complete -c $progname -n "not $noopt" -l nopgpfetch -d 'Do not prompt to import PGP keys' -f
complete -c $progname -n "not $noopt" -l useask -d 'Automatically resolve conflicts using pacmans ask flag' -f
complete -c $progname -n "not $noopt" -l nouseask -d 'Confirm conflicts manually during the install' -f
complete -c $progname -n "not $noopt" -l combinedupgrade -d 'Refresh then perform the repo and AUR upgrade together' -f
complete -c $progname -n "not $noopt" -l nocombinedupgrade -d 'Perform the repo upgrade and AUR upgrade separately' -f
complete -c $progname -n "not $noopt" -l batchinstall -d 'Build multiple AUR packages then install them together' -f
complete -c $progname -n "not $noopt" -l nobatchinstall -d 'Build and install each AUR package one by one' -f
complete -c $progname -n "not $noopt" -l rebuild -d 'Always build target packages' -f
complete -c $progname -n "not $noopt" -l rebuildall -d 'Always build all AUR packages' -f
complete -c $progname -n "not $noopt" -l rebuildtree -d 'Always build all AUR packages even if installed' -f
@ -254,4 +243,3 @@ complete -c $progname -n "not $noopt" -l norebuild -d 'Skip package build if in
complete -c $progname -n "not $noopt" -l mflags -d 'Pass the following options to makepkg' -f
complete -c $progname -n "not $noopt" -l gpgflags -d 'Pass the following options to gpg' -f
complete -c $progname -n "not $noopt" -l sudoloop -d 'Loop sudo calls in the background to avoid timeout' -f
complete -c $progname -n "not $noopt" -l nosudoloop -d 'Do not loop sudo calls in the background' -f


@ -1,5 +1,5 @@
#compdef yay
# vim:fdm=marker foldlevel=0 tabstop=2 shiftwidth=2 filetype=zsh
# vim:tabstop=2 shiftwidth=2 filetype=zsh
typeset -A opt_args
setopt extendedglob
@ -23,7 +23,7 @@ _pacman_opts_commands=(
# options for passing to _arguments: options common to all commands
_pacman_opts_common=(
'--repo[Assume targets are from the repositories]'
{-N,--repo}'[Assume targets are from the repositories]'
{-a,--aur}'[Assume targets are from the AUR]'
'--aururl[Set an alternative AUR URL]:url'
'--aurrpcurl[Set an alternative URL for the AUR /rpc endpoint]:url'
@ -70,12 +70,8 @@ _pacman_opts_common=(
'--cleanmenu[Give the option to clean build PKGBUILDS]'
'--diffmenu[Give the option to show diffs for build files]'
'--editmenu[Give the option to edit/view PKGBUILDS]'
'--upgrademenu[Show a detailed list of updates with the option to skip any]'
"--nocleanmenu[Don't clean build PKGBUILDS]"
"--nodiffmenu[Don't show diffs for build files]"
"--noeditmenu[Don't edit/view PKGBUILDS]"
"--noupgrademenu[Don't show the upgrade menu]"
"--askremovemake[Ask to remove makedepends after install]"
"--askyesremovemake[Ask to remove makedepends after install(with "Y" as default)]"
"--removemake[Remove makedepends after install]"
"--noremovemake[Don't remove makedepends after install]"
@ -84,34 +80,26 @@ _pacman_opts_common=(
'--singlelineresults[List each search result on its own line]'
'--doublelineresults[List each search result on two lines, like pacman]'
'--devel[Check -git/-svn/-hg development version]'
'--nodevel[Disable development version checking]'
'--cleanafter[Clean package sources after successful build]'
'--nocleanafter[Disable package sources cleaning after successful build]'
'--keepsrc[Keep pkg/ and src/ after building packages]'
'--timeupdate[Check packages modification date and version]'
'--notimeupdate[Check only package version change]'
'--redownload[Always download pkgbuilds of targets]'
'--redownloadall[Always download pkgbuilds of all AUR packages]'
'--noredownload[Skip pkgbuild download if in cache and up to date]'
'--rebuild[Always build target packages]'
'--rebuildall[Always build all AUR packages]'
'--provides[Look for matching providers when searching for packages]'
'--noprovides[Just look for packages by pkgname]'
'--pgpfetch[Prompt to import PGP keys from PKGBUILDs]'
"--nopgpfetch[Don't prompt to import PGP keys]"
"--useask[Automatically resolve conflicts using pacman's ask flag]"
'--nouseask[Confirm conflicts manually during the install]'
'--combinedupgrade[Refresh then perform the repo and AUR upgrade together]'
'--nocombinedupgrade[Perform the repo upgrade and AUR upgrade separately]'
'--rebuildtree[Always build all AUR packages even if installed]'
'--norebuild[Skip package build if in cache and up to date]'
'--mflags[Pass arguments to makepkg]:mflags'
'--gpgflags[Pass arguments to gpg]:gpgflags'
'--sudoloop[Loop sudo calls in the background to avoid timeout]'
'--nosudoloop[Do not loop sudo calls in the background]'
'--searchby[Search for packages using a specified field]'
'--sortby[Sort AUR results by a specific field during search]'
'--batchinstall[Build multiple AUR packages then install them together]'
'--nobatchinstall[Build and install each AUR package one by one]'
)
# options for passing to _arguments: options for --upgrade commands
@ -512,18 +500,17 @@ _pacman_zsh_comp() {
"$_pacman_opts_query_modifiers[@]" \
'*:package file:_files -g "*.pkg.tar*~*.sig(.,@)"'
;;
T*)
_pacman_action_deptest
;;
Q*)
_pacman_action_query
;;
P*)
_arguments -s : \
'-P' \
"$_pacman_opts_print_modifiers[@]"
;;
W*)
_arguments -s : \
'-W' \
"$_pacman_opts_web_modifiers[@]"
;;
R*)
@ -556,10 +543,7 @@ _pacman_zsh_comp() {
_pacman_action_sync
;;
T*)
_arguments -s : \
'-T' \
"$_pacman_opts_common[@]" \
":packages:_pacman_all_packages"
_pacman_action_deptest
;;
U*)
_pacman_action_upgrade
@ -569,10 +553,12 @@ _pacman_zsh_comp() {
;;
Y*)
_arguments -s : \
'-Y' \
"$_pacman_opts_yay_modifiers[@]"
;;
G*)
_arguments -s : \
'-G' \
"$_pacman_opts_getpkgbuild_modifiers[@]"
;;

doc/yay.8

@ -1,4 +1,4 @@
.TH "YAY" "8" "2019\-10\-21" "Yay v9.4+" "Yay Manual"
.TH "YAY" "8" "2019\-10\-21" "Yay v12.0+" "Yay Manual"
.nh
.ad l
.SH NAME
@ -19,10 +19,15 @@ This manpage only covers options unique to Yay. For other options see
\fBpacman(8)\fR.
.SH YAY OPERATIONS
.TP
.B \-Y, \-\-yay
Perform yay specific operations. This is the default if no other operation is
selected.
selected and targets are defined.
.TP
.B \-B, \-\-build
Build a PKGBUILD in a given directory.
.TP
.B \-P, \-\-show
@ -32,10 +37,14 @@ Perform yay specific print operations.
.B \-G, \-\-getpkgbuild
Downloads PKGBUILD from ABS or AUR. The ABS can only be used for Arch Linux repositories.
.RE
If no arguments are provided 'yay \-Syu' will be performed.
.TP
.B \-W, \-\-web
Web related operations such as voting for AUR packages.
If no operation is selected \-Y will be assumed.
.RE
If no operation is specified 'yay \-Syu' will be performed
If no operation is specified and targets are provided \-Y will be assumed
.SH EXTENDED PACMAN OPERATIONS
.TP
@ -54,7 +63,7 @@ Yay will also remove cached data about devel packages.
.SH NEW OPTIONS
.TP
.B \-\-repo
.B \-N, \-\-repo
Assume all targets are from the repositories. Additionally, actions such as
sysupgrade will only act on repository packages.
@ -73,6 +82,10 @@ packages.
Displays a list of packages matching the search terms and prompts the user on
which packages to install (yogurt mode).
The first search term is used to query the different sources and
the following search terms are used to narrow the search results
through exact matching.
.TP
.B \-\-gendb
Generate development package database. Tracks the latest commit for each
@ -84,16 +97,16 @@ used when migrating to Yay from another AUR helper.
.B \-c, \-\-clean
Remove unneeded dependencies.
.SH SHOW OPTIONS (APPLY TO \-P AND \-\-SHOW)
.TP
.B \-cc
Remove unneeded dependencies, including packages optionally required by any other package.
.SH SHOW OPTIONS (APPLY TO \-P AND \-\-show)
.TP
.B \-c, \-\-complete
Print a list of all AUR and repo packages. This allows shell completion
and is not intended to be used directly by the user.
.TP
.B \-f, \-\-fish
During complete adjust the output for the fish shell.
.TP
.B \-d, \-\-defaultconfig
Print default yay configuration.
@ -102,20 +115,12 @@ Print default yay configuration.
.B \-g, \-\-currentconfig
Print current yay configuration.
.TP
.B \-n, \-\-numberupgrades
Deprecated, use \fByay -Qu\fR and \fBwc -l\fR instead\%.
.TP
.B \-s, \-\-stats
Displays information about installed packages and system health. If there are
orphaned, or out\-of\-date packages, or packages that no longer exist on the
AUR; warnings will be displayed.
.TP
.B \-u, \-\-upgrades
Deprecated, use \fByay -Qu\fR instead\%.
.TP
.B \-w, \-\-news
Print new news from the Archlinux homepage. News is considered new if it is
@ -126,7 +131,12 @@ available news.
.B \-q, \-\-quiet
Only show titles when printing news.
.SH GETPKGBUILD OPTIONS (APPLY TO \-G AND \-\-GETPKGBUILD)
.SH BUILD OPTIONS (APPLY TO \-B AND \-\-build)
.TP
.B \-i, \-\-install
Build and install a PKGBUILD in a given directory
.SH GETPKGBUILD OPTIONS (APPLY TO \-G AND \-\-getpkgbuild)
.TP
.B \-f, \-\-force
Force download for ABS packages that already exist in the current directory. This
@ -136,7 +146,7 @@ ensures directories are not accidentally overwritten.
.B \-p, \-\-print
Prints the PKGBUILD of the given packages to stdout.
.SH WEB OPTIONS (APPLY TO \-W AND \-\-WEB)
.SH WEB OPTIONS (APPLY TO \-W AND \-\-web)
.TP
Web related operations such as voting for AUR packages.
@ -172,8 +182,8 @@ the AUR cache when deciding if Yay should skip builds.
.TP
.B \-\-editor <command>
Editor to use when editing PKGBUILDs. If this is not set the \fBEDITOR\fR
environment variable will be checked, followed by \fBVISUAL\fR. If none of
Editor to use when editing PKGBUILDs. If this is not set the \fBVISUAL\fR
environment variable will be checked, followed by \fBEDITOR\fR. If none of
these are set Yay will prompt the user for an editor.
.TP
@ -239,7 +249,7 @@ cache to never be refreshed.
Sort AUR results by a specific field during search.
.TP
.B \-\-searchby <name|name-desc|maintainer|depends|checkdepends|makedepends|optdepends>
.B \-\-searchby <name|name-desc|maintainer|depends|checkdepends|makedepends|optdepends|provides|conflicts|replaces|groups|keywords|comaintainers>
Search for AUR packages by querying the specified field.
.TP
@ -287,6 +297,9 @@ Unset the answer for the upgrade menu.
Show the clean menu. This menu gives you the chance to fully delete the
downloaded build files from Yay's cache before redownloading a fresh copy.
If 'cleanmenu' is enabled in the configuration file, you can temporarily disable it by
using '--cleanmenu=false' on the command line
.TP
.B \-\-diffmenu
Show the diff menu. This menu gives you the option to view diffs from
@ -304,37 +317,14 @@ before building.
\fBWarning\fR: Yay resolves dependencies ahead of time via the RPC. It is not
recommended to edit pkgbuild variables unless you know what you are doing.
.TP
.B \-\-upgrademenu
Show a detailed list of updates in a similar format to VerbosePkgLists.
Upgrades can also be skipped using numbers, number ranges or repo names.
Additionally ^ can be used to invert the selection.
\fBWarning\fR: It is not recommended to skip updates from the repositories as
this can lead to partial upgrades. This feature is intended to easily skip AUR
updates on the fly that may be broken or have a long compile time. Ultimately
it is up to the user what upgrades they skip.
.TP
.B \-\-nocleanmenu
Do not show the clean menu.
.TP
.B \-\-nodiffmenu
Do not show the diff menu.
.TP
.B \-\-noeditmenu
Do not show the edit menu.
.TP
.B \-\-noupgrademenu
Do not show the upgrade menu.
.TP
.B \-\-askremovemake
Ask to remove makedepends after installing packages.
.TP
.B \-\-askyesremovemake
Ask to remove makedepends after installing packages (with "Y" as default).
.TP
.B \-\-removemake
Remove makedepends after installing packages.
@ -373,9 +363,8 @@ checked almost instantly and not require the original pkgbuild to be downloaded.
The slower pacaur-like devel checks can be implemented manually by piping
a list of packages into yay (see \fBexamples\fR).
.TP
.B \-\-nodevel
Do not check for development packages updates during sysupgrade.
If 'devel' is enabled in the configuration file, you can temporarily disable it by
using '--devel=false' on the command line.
.TP
.B \-\-cleanafter
@ -386,26 +375,18 @@ This allows VCS packages to easily pull an update
instead of having to reclone the entire repo.
.TP
.B \-\-nocleanafter
Do not remove package sources after a successful install.
.B \-\-keepsrc
Keep pkg/ and src/ after building packages.
.TP
.B \-\-timeupdate
During sysupgrade, also compare the build time of installed packages against
the last modification time of each package's AUR page.
.TP
.B \-\-notimeupdate
Do not consider build times during sysupgrade.
.TP
.B \-\-separatesources
Separate query results by source (AUR and sync).
.TP
.B \-\-noseparatesources
Do not separate query results by source when searching.
.TP
.B \-\-redownload
Always download pkgbuilds of targets even when a copy is available in cache.
@ -426,23 +407,11 @@ Look for matching providers when searching for AUR packages. When multiple
providers are found a menu will appear prompting you to pick one. This
increases dependency resolve time although this should not be noticeable.
.TP
.B \-\-noprovides
Do not look for matching providers when searching for AUR packages.
Yay will never show its provider menu but Pacman will still show its
provider menu for repo packages.
.TP
.B \-\-pgpfetch
Prompt to import unknown PGP keys from the \fBvalidpgpkeys\fR field of each
PKGBUILD.
.TP
.B \-\-nopgpfetch
Do not prompt to import unknown PGP keys. This is likely to cause a build
failure unless using options such as \fB\-\-skippgpcheck\fR or a customized
gpg config\%.
.TP
.B \-\-useask
Use pacman's --ask flag to automatically confirm package conflicts. Yay lists
@ -450,11 +419,6 @@ conflicts ahead of time. It is possible that Yay does not detect
a conflict, causing a package to be removed without the user's confirmation.
However, this is very unlikely.
.TP
.B \-\-nouseask
Manually resolve package conflicts during the install. Packages which do not
conflict will not need to be confirmed manually.
.TP
.B \-\-combinedupgrade
During sysupgrade, Yay will first perform a refresh, then show
@ -466,12 +430,6 @@ If Yay exits for any reason after the refresh without upgrading, it is then
the user's responsibility to either resolve the reason Yay exited or run
a sysupgrade through pacman directly.
.TP
.B \-\-nocombinedupgrade
During sysupgrade, Pacman \-Syu will be called, then the AUR upgrade will
start. This means the upgrade menu and pkgbuild review will be performed
after the sysupgrade has finished.
.TP
.B \-\-batchinstall
When building and installing AUR packages, instead of installing each package
@ -479,10 +437,6 @@ after building, queue each package for install. Then once either all packages
are built or a package in the build queue is needed as a dependency to build
another package, install all the packages in the install queue.
.TP
.B \-\-nobatchinstall
Always install AUR packages immediately after building them.
.TP
.B \-\-rebuild
Always build target packages even when a copy is available in cache.
@ -536,10 +490,6 @@ separated list that is quoted by the shell.
Loop sudo calls in the background to prevent sudo from timing out during long
builds.
.TP
.B \-\-nosudoloop
Do not loop sudo calls in the background.
.SH EXAMPLES
.TP
yay \fIfoo\fR
@ -644,6 +594,6 @@ See the arch wiki at https://wiki.archlinux.org/index.php/Arch_User_Repository f
Please report bugs to our GitHub page https://github.com/Jguer/yay
.SH AUTHORS
Jguer <joaogg3@gmail.com>
Jguer <joguer@proton.me>
.br
Morgan <morganamilo@archlinux.org>

9
errors.go Normal file

@ -0,0 +1,9 @@
package main
import (
"errors"
"github.com/leonelquinteros/gotext"
)
var ErrPackagesNotFound = errors.New(gotext.Get("could not find all required packages"))
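The sentinel error introduced here is wrapped with %w further down (see local_install.go), which is what lets callers detect it with errors.Is. A minimal standalone sketch of that pattern, with a plain errors.New standing in for the gotext-backed message:

package main

import (
	"errors"
	"fmt"
)

// stand-in for the gotext-backed sentinel; the real message is translated at init
var errPackagesNotFound = errors.New("could not find all required packages")

func resolve(name string) error {
	// hypothetical failure path mirroring fmt.Errorf("%w: %s %s", ErrPackagesNotFound, ...)
	return fmt.Errorf("%w: %s", errPackagesNotFound, name)
}

func main() {
	if err := resolve("foo"); errors.Is(err, errPackagesNotFound) {
		fmt.Println(err)
	}
}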

41
get.go

@ -7,27 +7,27 @@ import (
"os"
"strings"
"github.com/Jguer/aur"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/download"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/download"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
// yay -Gp.
func printPkgbuilds(dbExecutor download.DBSearcher, httpClient *http.Client, targets []string,
mode parser.TargetMode, aurURL string) error {
pkgbuilds, err := download.PKGBUILDs(dbExecutor, httpClient, targets, aurURL, mode)
func printPkgbuilds(dbExecutor download.DBSearcher, aurClient aur.QueryClient,
httpClient *http.Client, logger *text.Logger, targets []string,
mode parser.TargetMode, aurURL string,
) error {
pkgbuilds, err := download.PKGBUILDs(dbExecutor, aurClient, httpClient, logger, targets, aurURL, mode)
if err != nil {
text.Errorln(err)
logger.Errorln(err)
}
if len(pkgbuilds) != 0 {
for target, pkgbuild := range pkgbuilds {
fmt.Printf("\n\n# %s\n\n", target)
fmt.Print(string(pkgbuild))
}
for target, pkgbuild := range pkgbuilds {
logger.Printf("\n\n# %s\n\n%s", target, string(pkgbuild))
}
if len(pkgbuilds) != len(targets) {
@ -39,7 +39,7 @@ func printPkgbuilds(dbExecutor download.DBSearcher, httpClient *http.Client, tar
}
}
text.Warnln(gotext.Get("Unable to find the following packages:"), strings.Join(missing, ", "))
logger.Warnln(gotext.Get("Unable to find the following packages:"), " ", strings.Join(missing, ", "))
return fmt.Errorf("")
}
@ -48,17 +48,18 @@ func printPkgbuilds(dbExecutor download.DBSearcher, httpClient *http.Client, tar
}
// yay -G.
func getPkgbuilds(ctx context.Context, dbExecutor download.DBSearcher,
config *settings.Configuration, targets []string, force bool) error {
func getPkgbuilds(ctx context.Context, dbExecutor download.DBSearcher, aurClient aur.QueryClient,
run *runtime.Runtime, targets []string, force bool,
) error {
wd, err := os.Getwd()
if err != nil {
return err
}
cloned, errD := download.PKGBUILDRepos(ctx, dbExecutor,
config.Runtime.CmdBuilder, targets, config.Runtime.Mode, config.AURURL, wd, force)
cloned, errD := download.PKGBUILDRepos(ctx, dbExecutor, aurClient,
run.CmdBuilder, run.Logger, targets, run.Cfg.Mode, run.Cfg.AURURL, wd, force)
if errD != nil {
text.Errorln(errD)
run.Logger.Errorln(errD)
}
if len(targets) != len(cloned) {
@ -70,7 +71,7 @@ func getPkgbuilds(ctx context.Context, dbExecutor download.DBSearcher,
}
}
text.Warnln(gotext.Get("Unable to find the following packages:"), strings.Join(missing, ", "))
run.Logger.Warnln(gotext.Get("Unable to find the following packages:"), " ", strings.Join(missing, ", "))
err = fmt.Errorf("")
}


@ -1,10 +0,0 @@
package main
import "github.com/Jguer/yay/v11/pkg/settings"
var (
yayVersion = "11.3.0" // To be set by compiler.
localePath = "/usr/share/locale" // To be set by compiler.
)
var config *settings.Configuration // YayConf holds the current config values for yay.

29
go.mod

@ -1,26 +1,35 @@
module github.com/Jguer/yay/v11
module github.com/Jguer/yay/v12
require (
github.com/Jguer/aur v1.0.1
github.com/Jguer/go-alpm/v2 v2.1.2
github.com/Jguer/aur v1.2.3
github.com/Jguer/go-alpm/v2 v2.2.2
github.com/Jguer/votar v1.0.0
github.com/Morganamilo/go-pacmanconf v0.0.0-20210502114700-cff030e927a5
github.com/Morganamilo/go-srcinfo v1.0.0
github.com/adrg/strutil v0.3.1
github.com/bradleyjkemp/cupaloy v2.3.0+incompatible
github.com/leonelquinteros/gotext v1.5.0
github.com/stretchr/testify v1.8.0
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab
golang.org/x/term v0.0.0-20220722155259-a9ba230a4035
golang.org/x/text v0.3.7 // indirect
github.com/deckarep/golang-set/v2 v2.8.0
github.com/hashicorp/go-multierror v1.1.1
github.com/leonelquinteros/gotext v1.7.2
github.com/stretchr/testify v1.10.0
golang.org/x/net v0.41.0
golang.org/x/sys v0.33.0
golang.org/x/term v0.32.0
gopkg.in/h2non/gock.v1 v1.1.2
)
require (
github.com/adrg/strutil v0.3.0
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/h2non/parth v0.0.0-20190131123155-b4df798d6542 // indirect
github.com/hashicorp/errwrap v1.1.0 // indirect
github.com/itchyny/gojq v0.12.17 // indirect
github.com/itchyny/timefmt-go v0.1.6 // indirect
github.com/mitchellh/mapstructure v1.5.0 // indirect
github.com/ohler55/ojg v1.26.1 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)
go 1.17
go 1.23.5
toolchain go1.24.0
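The module path change from .../v11 to .../v12 follows Go's semantic import versioning: a module at major version 2 or above must carry the /vN suffix in its path, so every internal import has to be rewritten to the new path as well, which is what the v11-to-v12 import changes throughout the Go files in this compare are. A small sketch of what a caller looks like after the bump (package and constructor taken from the diff; the usage itself is illustrative):

package main

import (
	"os"

	// before the bump this was "github.com/Jguer/yay/v11/pkg/text"
	"github.com/Jguer/yay/v12/pkg/text"
)

func main() {
	// constructor signature as used in main.go in this compare
	logger := text.NewLogger(os.Stdout, os.Stderr, os.Stdin, false, "example")
	logger.Println("imports must use the v12 path after the module rename")
}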

63
go.sum

@ -1,15 +1,15 @@
github.com/Jguer/aur v1.0.1 h1:+GDOq0RuVn7CXpXzd8W85/+hPNDYonRZ3ONPm87e1jo=
github.com/Jguer/aur v1.0.1/go.mod h1:1/SQjhWahmk2xKcmAm6XO1zGqK8HgYw3xlJM6a7845E=
github.com/Jguer/go-alpm/v2 v2.1.2 h1:CGTIxzuEpT9Q3a7IBrx0E6acoYoaHX2Z93UOApPDhgU=
github.com/Jguer/go-alpm/v2 v2.1.2/go.mod h1:uLQcTMNM904dRiGU+/JDtDdd7Nd8mVbEVaHjhmziT7w=
github.com/Jguer/aur v1.2.3 h1:D+OGgLxnAnZnw88DsRvnRQsn0Poxsy9ng7pBcsA0krM=
github.com/Jguer/aur v1.2.3/go.mod h1:Dahvb6L1yr0rR7svyYSDwaRJoQMeyvJblwJ3QH/7CUs=
github.com/Jguer/go-alpm/v2 v2.2.2 h1:sPwUoZp1X5Tw6K6Ba1lWvVJfcgVNEGVcxARLBttZnC0=
github.com/Jguer/go-alpm/v2 v2.2.2/go.mod h1:lfe8gSe83F/KERaQvEfrSqQ4n+8bES+ZIyKWR/gm3MI=
github.com/Jguer/votar v1.0.0 h1:drPYpV5Py5BeAQS8xezmT6uCEfLzotNjLf5yfmlHKTg=
github.com/Jguer/votar v1.0.0/go.mod h1:rc6vgVlTqNjI4nAnPbDTbdxw/N7kXkbB8BcUDjeFbYQ=
github.com/Morganamilo/go-pacmanconf v0.0.0-20210502114700-cff030e927a5 h1:TMscPjkb1ThXN32LuFY5bEYIcXZx3YlwzhS1GxNpn/c=
github.com/Morganamilo/go-pacmanconf v0.0.0-20210502114700-cff030e927a5/go.mod h1:Hk55m330jNiwxRodIlMCvw5iEyoRUCIY64W1p9D+tHc=
github.com/Morganamilo/go-srcinfo v1.0.0 h1:Wh4nEF+HJWo+29hnxM18Q2hi+DUf0GejS13+Wg+dzmI=
github.com/Morganamilo/go-srcinfo v1.0.0/go.mod h1:MP6VGY1NNpVUmYIEgoM9acix95KQqIRyqQ0hCLsyYUY=
github.com/adrg/strutil v0.3.0 h1:bi/HB2zQbDihC8lxvATDTDzkT4bG7PATtVnDYp5rvq4=
github.com/adrg/strutil v0.3.0/go.mod h1:Jz0wzBVE6Uiy9wxo62YEqEY1Nwto3QlLl1Il5gkLKWU=
github.com/adrg/strutil v0.3.1 h1:OLvSS7CSJO8lBii4YmBt8jiK9QOtB9CzCzwl4Ic/Fz4=
github.com/adrg/strutil v0.3.1/go.mod h1:8h90y18QLrs11IBffcGX3NW/GFBXCMcNg4M7H6MspPA=
github.com/alexflint/go-arg v1.4.3/go.mod h1:3PZ/wp/8HuqRZMUUgu7I+e1qcpUbvmS258mRXkFH4IA=
github.com/alexflint/go-scalar v1.1.0/go.mod h1:LoFvNMqS1CPrMVltza4LvnGKhaSpc3oyLEBUZVhhS2o=
github.com/bradleyjkemp/cupaloy v2.3.0+incompatible h1:UafIjBvWQmS9i/xRg+CamMrnLTKNzo+bdmT/oH34c2Y=
@ -17,45 +17,52 @@ github.com/bradleyjkemp/cupaloy v2.3.0+incompatible/go.mod h1:Au1Xw1sgaJ5iSFktEh
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/deckarep/golang-set/v2 v2.8.0 h1:swm0rlPCmdWn9mESxKOjWk8hXSqoxOp+ZlfuyaAdFlQ=
github.com/deckarep/golang-set/v2 v2.8.0/go.mod h1:VAky9rY/yGXJOLEDv3OMci+7wtDpOF4IN+y82NBOac4=
github.com/h2non/parth v0.0.0-20190131123155-b4df798d6542 h1:2VTzZjLZBgl62/EtslCrtky5vbi9dd7HrQPQIx6wqiw=
github.com/h2non/parth v0.0.0-20190131123155-b4df798d6542/go.mod h1:Ow0tF8D4Kplbc8s8sSb3V2oUCygFHVp8gC3Dn6U4MNI=
github.com/leonelquinteros/gotext v1.5.0 h1:ODY7LzLpZWWSJdAHnzhreOr6cwLXTAmc914FOauSkBM=
github.com/leonelquinteros/gotext v1.5.0/go.mod h1:OCiUVHuhP9LGFBQ1oAmdtNCHJCiHiQA8lf4nAifHkr0=
github.com/hashicorp/errwrap v1.0.0/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brvKWEqk/Jc4=
github.com/hashicorp/errwrap v1.1.0 h1:OxrOeh75EUXMY8TBjag2fzXGZ40LB6IKw45YeGUDY2I=
github.com/hashicorp/errwrap v1.1.0/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brvKWEqk/Jc4=
github.com/hashicorp/go-multierror v1.1.1 h1:H5DkEtf6CXdFp0N0Em5UCwQpXMWke8IA0+lD48awMYo=
github.com/hashicorp/go-multierror v1.1.1/go.mod h1:iw975J/qwKPdAO1clOe2L8331t/9/fmwbPZ6JB6eMoM=
github.com/itchyny/gojq v0.12.17 h1:8av8eGduDb5+rvEdaOO+zQUjA04MS0m3Ps8HiD+fceg=
github.com/itchyny/gojq v0.12.17/go.mod h1:WBrEMkgAfAGO1LUcGOckBl5O726KPp+OlkKug0I/FEY=
github.com/itchyny/timefmt-go v0.1.6 h1:ia3s54iciXDdzWzwaVKXZPbiXzxxnv1SPGFfM/myJ5Q=
github.com/itchyny/timefmt-go v0.1.6/go.mod h1:RRDZYC5s9ErkjQvTvvU7keJjxUYzIISJGxm9/mAERQg=
github.com/leonelquinteros/gotext v1.7.2 h1:bDPndU8nt+/kRo1m4l/1OXiiy2v7Z7dfPQ9+YP7G1Mc=
github.com/leonelquinteros/gotext v1.7.2/go.mod h1:9/haCkm5P7Jay1sxKDGJ5WIg4zkz8oZKw4ekNpALob8=
github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=
github.com/mitchellh/mapstructure v1.5.0/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RRV2QTWOzhPopBRo=
github.com/nbio/st v0.0.0-20140626010706-e9e8d9816f32 h1:W6apQkHrMkS0Muv8G/TipAy/FJl/rCYT0+EuS8+Z0z4=
github.com/nbio/st v0.0.0-20140626010706-e9e8d9816f32/go.mod h1:9wM+0iRr9ahx58uYLpLIr5fm8diHn0JbqRycJi6w0Ms=
github.com/ohler55/ojg v1.26.1 h1:J5TaLmVEuvnpVH7JMdT1QdbpJU545Yp6cKiCO4aQILc=
github.com/ohler55/ojg v1.26.1/go.mod h1:gQhDVpQLqrmnd2eqGAvJtn+NfKoYJbe/A4Sj3/Vro4o=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.4.0 h1:M2gUjqZET1qApGOWNSnZ49BAIMX4F/1plDv3+l31EJ4=
github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
github.com/stretchr/objx v0.5.2 h1:xuMeJ0Sdp5ZMRXx/aWO6RZxdr3beISkG5/G/aIRr3pY=
github.com/stretchr/objx v0.5.2/go.mod h1:FRsXN1f5AsAjCGJKqEizvkpNtU+EGNCLh3NxZ/8L+MA=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.7.2/go.mod h1:R6va5+xMeoiuVRoj+gSkQ7d3FALtqAAGI1FQKckRals=
github.com/stretchr/testify v1.8.0 h1:pSgiaMZlXftHpm5L7V1+rVB+AZJydKsMxsQBIJw4PKk=
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20191011191535-87dc89f01550/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/mod v0.1.1-0.20191105210325-c90efee705ee/go.mod h1:QqPTAvyqsEbceGzBzNggFXnrqF1CaUcvgkdR5Ot7KZg=
golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab h1:2QkjZIsXupsJbJIdSjjUOgWK3aEtzyuh2mPt3l/CkeU=
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/term v0.0.0-20220722155259-a9ba230a4035 h1:Q5284mrmYTpACcm+eAKjKJH48BBwSyfJqmmGDTtT8Vc=
golang.org/x/term v0.0.0-20220722155259-a9ba230a4035/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.7 h1:olpwvP2KacW1ZWvsR7uQhoyTYvKAupfQrRGBFM352Gk=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/tools v0.0.0-20200221224223-e1da425f72fd/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=
golang.org/x/xerrors v0.0.0-20191011141410-1b5146add898/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
golang.org/x/net v0.41.0 h1:vBTly1HeNPEn3wtREYfy4GZ/NECgw2Cnl+nK6Nz3uvw=
golang.org/x/net v0.41.0/go.mod h1:B/K4NNqkfmg07DQYrbwvSluqCJOOXwUjeb/5lOisjbA=
golang.org/x/sys v0.33.0 h1:q3i8TbbEz+JRD9ywIRlyRAQbM0qF7hu24q3teo2hbuw=
golang.org/x/sys v0.33.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/term v0.32.0 h1:DR4lr0TjUs3epypdhTOkMmuF5CDFJ/8pOnbzMZPQ7bg=
golang.org/x/term v0.32.0/go.mod h1:uZG1FhGx848Sqfsq4/DlJr3xGGsYMu/L5GW4abiaEPQ=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/h2non/gock.v1 v1.1.2 h1:jBbHXgGBK/AoPVfJh5x4r/WxIrElvbLel8TCZkkZJoY=
gopkg.in/h2non/gock.v1 v1.1.2/go.mod h1:n7UGz/ckNChHiK05rDoiC4MYSunEC/lyaUm2WWaDva0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.0-20210107192922-496545a6307b/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=


@ -1,855 +0,0 @@
package main
import (
"context"
"errors"
"fmt"
"os"
"path/filepath"
"strconv"
"strings"
"sync"
alpm "github.com/Jguer/go-alpm/v2"
gosrc "github.com/Morganamilo/go-srcinfo"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/completion"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/download"
"github.com/Jguer/yay/v11/pkg/menus"
"github.com/Jguer/yay/v11/pkg/pgp"
"github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
)
func asdeps(ctx context.Context, cmdArgs *parser.Arguments, pkgs []string) (err error) {
if len(pkgs) == 0 {
return nil
}
cmdArgs = cmdArgs.CopyGlobal()
_ = cmdArgs.AddArg("q", "D", "asdeps")
cmdArgs.AddTarget(pkgs...)
err = config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
if err != nil {
return errors.New(gotext.Get("error updating package install reason to dependency"))
}
return nil
}
func asexp(ctx context.Context, cmdArgs *parser.Arguments, pkgs []string) (err error) {
if len(pkgs) == 0 {
return nil
}
cmdArgs = cmdArgs.CopyGlobal()
_ = cmdArgs.AddArg("q", "D", "asexplicit")
cmdArgs.AddTarget(pkgs...)
err = config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
if err != nil {
return errors.New(gotext.Get("error updating package install reason to explicit"))
}
return nil
}
// Install handles package installs.
func install(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor, ignoreProviders bool) error {
var (
incompatible stringset.StringSet
do *dep.Order
srcinfos map[string]*gosrc.Srcinfo
noDeps = cmdArgs.ExistsDouble("d", "nodeps")
noCheck = strings.Contains(config.MFlags, "--nocheck")
assumeInstalled = cmdArgs.GetArgs("assume-installed")
sysupgradeArg = cmdArgs.ExistsArg("u", "sysupgrade")
refreshArg = cmdArgs.ExistsArg("y", "refresh")
warnings = query.NewWarnings()
)
if noDeps {
config.Runtime.CmdBuilder.AddMakepkgFlag("-d")
}
if config.Runtime.Mode.AtLeastRepo() {
if config.CombinedUpgrade {
if refreshArg {
if errR := earlyRefresh(ctx, cmdArgs); errR != nil {
return errors.New(gotext.Get("error refreshing databases"))
}
}
} else if refreshArg || sysupgradeArg || len(cmdArgs.Targets) > 0 {
if errP := earlyPacmanCall(ctx, cmdArgs, dbExecutor); errP != nil {
return errP
}
}
}
// we may have done -Sy, our handle now has an old
// database.
if errRefresh := dbExecutor.RefreshHandle(); errRefresh != nil {
return errRefresh
}
localNames, remoteNames, err := query.GetPackageNamesBySource(dbExecutor)
if err != nil {
return err
}
remoteNamesCache := stringset.FromSlice(remoteNames)
localNamesCache := stringset.FromSlice(localNames)
requestTargets := cmdArgs.Copy().Targets
// create the arguments to pass for the repo install
arguments := cmdArgs.Copy()
arguments.DelArg("asdeps", "asdep")
arguments.DelArg("asexplicit", "asexp")
arguments.Op = "S"
arguments.ClearTargets()
if config.Runtime.Mode == parser.ModeAUR {
arguments.DelArg("u", "sysupgrade")
}
// if we are doing -u also request all packages needing update
if sysupgradeArg {
ignore, targets, errUp := sysupgradeTargets(ctx, dbExecutor, cmdArgs.ExistsDouble("u", "sysupgrade"))
if errUp != nil {
return errUp
}
for _, up := range targets {
cmdArgs.AddTarget(up)
requestTargets = append(requestTargets, up)
}
if len(ignore) > 0 {
arguments.CreateOrAppendOption("ignore", ignore.ToSlice()...)
}
}
targets := stringset.FromSlice(cmdArgs.Targets)
dp, err := dep.GetPool(ctx, requestTargets,
warnings, dbExecutor, config.Runtime.AURClient, config.Runtime.Mode,
ignoreProviders, settings.NoConfirm, config.Provides, config.ReBuild, config.RequestSplitN, noDeps, noCheck, assumeInstalled)
if err != nil {
return err
}
if errC := dp.CheckMissing(noDeps, noCheck); errC != nil {
return errC
}
if len(dp.Aur) == 0 {
if !config.CombinedUpgrade {
if sysupgradeArg {
fmt.Println(gotext.Get(" there is nothing to do"))
}
return nil
}
cmdArgs.Op = "S"
cmdArgs.DelArg("y", "refresh")
if arguments.ExistsArg("ignore") {
cmdArgs.CreateOrAppendOption("ignore", arguments.GetArgs("ignore")...)
}
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
cmdArgs, config.Runtime.Mode, settings.NoConfirm))
}
conflicts, errCC := dp.CheckConflicts(config.UseAsk, settings.NoConfirm, noDeps)
if errCC != nil {
return errCC
}
do = dep.GetOrder(dp, noDeps, noCheck)
for _, pkg := range do.Repo {
arguments.AddTarget(pkg.DB().Name() + "/" + pkg.Name())
}
for _, pkg := range dp.Groups {
arguments.AddTarget(pkg)
}
if len(do.Aur) == 0 && len(arguments.Targets) == 0 &&
(!cmdArgs.ExistsArg("u", "sysupgrade") || config.Runtime.Mode == parser.ModeAUR) {
fmt.Println(gotext.Get(" there is nothing to do"))
return nil
}
do.Print()
fmt.Println()
if config.CleanAfter {
defer cleanAfter(ctx, do.Aur)
}
if do.HasMake() {
switch config.RemoveMake {
case "yes":
defer func() {
err = removeMake(ctx, do)
}()
case "no":
break
default:
if text.ContinueTask(os.Stdin, gotext.Get("Remove make dependencies after install?"), false, settings.NoConfirm) {
defer func() {
err = removeMake(ctx, do)
}()
}
}
}
if errCleanMenu := menus.Clean(config.CleanMenu,
config.BuildDir, do.Aur,
remoteNamesCache, settings.NoConfirm, config.AnswerClean); errCleanMenu != nil {
if errors.As(errCleanMenu, &settings.ErrUserAbort{}) {
return errCleanMenu
}
text.Errorln(errCleanMenu)
}
toSkip := pkgbuildsToSkip(do.Aur, targets)
toClone := make([]string, 0, len(do.Aur))
for _, base := range do.Aur {
if !toSkip.Get(base.Pkgbase()) {
toClone = append(toClone, base.Pkgbase())
}
}
if toSkipSlice := toSkip.ToSlice(); len(toSkipSlice) != 0 {
text.OperationInfoln(
gotext.Get("PKGBUILD up to date, Skipping (%d/%d): %s",
len(toSkipSlice), len(toClone), text.Cyan(strings.Join(toSkipSlice, ", "))))
}
cloned, errA := download.AURPKGBUILDRepos(ctx,
config.Runtime.CmdBuilder, toClone, config.AURURL, config.BuildDir, false)
if errA != nil {
return errA
}
if errDiffMenu := menus.Diff(ctx, config.Runtime.CmdBuilder, config.BuildDir,
config.DiffMenu, do.Aur, remoteNamesCache,
cloned, settings.NoConfirm, config.AnswerDiff); errDiffMenu != nil {
if errors.As(errDiffMenu, &settings.ErrUserAbort{}) {
return errDiffMenu
}
text.Errorln(errDiffMenu)
}
if errM := mergePkgbuilds(ctx, do.Aur); errM != nil {
return errM
}
srcinfos, err = parseSrcinfoFiles(do.Aur, true)
if err != nil {
return err
}
if errEditMenu := menus.Edit(config.EditMenu, config.BuildDir, do.Aur,
config.Editor, config.EditorFlags, remoteNamesCache, srcinfos,
settings.NoConfirm, config.AnswerEdit); errEditMenu != nil {
if errors.As(errEditMenu, &settings.ErrUserAbort{}) {
return errEditMenu
}
text.Errorln(errEditMenu)
}
incompatible, err = getIncompatible(do.Aur, srcinfos, dbExecutor)
if err != nil {
return err
}
if config.PGPFetch {
if errCPK := pgp.CheckPgpKeys(do.Aur, srcinfos, config.GpgBin, config.GpgFlags, settings.NoConfirm); errCPK != nil {
return errCPK
}
}
if !config.CombinedUpgrade {
arguments.DelArg("u", "sysupgrade")
}
if len(arguments.Targets) > 0 || arguments.ExistsArg("u") {
if errShow := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
arguments, config.Runtime.Mode, settings.NoConfirm)); errShow != nil {
return errors.New(gotext.Get("error installing repo packages"))
}
deps := make([]string, 0)
exp := make([]string, 0)
for _, pkg := range do.Repo {
if !dp.Explicit.Get(pkg.Name()) && !localNamesCache.Get(pkg.Name()) && !remoteNamesCache.Get(pkg.Name()) {
deps = append(deps, pkg.Name())
continue
}
if cmdArgs.ExistsArg("asdeps", "asdep") && dp.Explicit.Get(pkg.Name()) {
deps = append(deps, pkg.Name())
} else if cmdArgs.ExistsArg("asexp", "asexplicit") && dp.Explicit.Get(pkg.Name()) {
exp = append(exp, pkg.Name())
}
}
if errDeps := asdeps(ctx, cmdArgs, deps); errDeps != nil {
return errDeps
}
if errExp := asexp(ctx, cmdArgs, exp); errExp != nil {
return errExp
}
}
go func() {
_ = completion.Update(ctx, config.Runtime.HTTPClient, dbExecutor,
config.AURURL, config.Runtime.CompletionPath, config.CompletionInterval, false)
}()
if errP := downloadPKGBUILDSourceFanout(ctx, config.Runtime.CmdBuilder, config.BuildDir,
do.Aur, incompatible, config.MaxConcurrentDownloads); errP != nil {
text.Errorln(errP)
}
if errB := buildInstallPkgbuilds(ctx, cmdArgs, dbExecutor, dp, do, srcinfos, incompatible, conflicts, noDeps, noCheck); errB != nil {
return errB
}
return nil
}
func removeMake(ctx context.Context, do *dep.Order) error {
removeArguments := parser.MakeArguments()
err := removeArguments.AddArg("R", "u")
if err != nil {
return err
}
for _, pkg := range do.GetMake() {
removeArguments.AddTarget(pkg)
}
oldValue := settings.NoConfirm
settings.NoConfirm = true
err = config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
removeArguments, config.Runtime.Mode, settings.NoConfirm))
settings.NoConfirm = oldValue
return err
}
func inRepos(dbExecutor db.Executor, pkg string) bool {
target := dep.ToTarget(pkg)
if target.DB == "aur" {
return false
} else if target.DB != "" {
return true
}
previousHideMenus := settings.HideMenus
settings.HideMenus = true
exists := dbExecutor.SyncSatisfierExists(target.DepString())
settings.HideMenus = previousHideMenus
return exists || len(dbExecutor.PackagesFromGroup(target.Name)) > 0
}
func earlyPacmanCall(ctx context.Context, cmdArgs *parser.Arguments, dbExecutor db.Executor) error {
arguments := cmdArgs.Copy()
arguments.Op = "S"
targets := cmdArgs.Targets
cmdArgs.ClearTargets()
arguments.ClearTargets()
if config.Runtime.Mode == parser.ModeRepo {
arguments.Targets = targets
} else {
// separate aur and repo targets
for _, target := range targets {
if inRepos(dbExecutor, target) {
arguments.AddTarget(target)
} else {
cmdArgs.AddTarget(target)
}
}
}
if cmdArgs.ExistsArg("y", "refresh") || cmdArgs.ExistsArg("u", "sysupgrade") || len(arguments.Targets) > 0 {
if err := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
arguments, config.Runtime.Mode, settings.NoConfirm)); err != nil {
return errors.New(gotext.Get("error installing repo packages"))
}
}
return nil
}
func earlyRefresh(ctx context.Context, cmdArgs *parser.Arguments) error {
arguments := cmdArgs.Copy()
cmdArgs.DelArg("y", "refresh")
arguments.DelArg("u", "sysupgrade")
arguments.DelArg("s", "search")
arguments.DelArg("i", "info")
arguments.DelArg("l", "list")
arguments.ClearTargets()
return config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
arguments, config.Runtime.Mode, settings.NoConfirm))
}
func alpmArchIsSupported(alpmArch []string, arch string) bool {
if arch == "any" {
return true
}
for _, a := range alpmArch {
if a == arch {
return true
}
}
return false
}
func getIncompatible(bases []dep.Base, srcinfos map[string]*gosrc.Srcinfo, dbExecutor db.Executor) (stringset.StringSet, error) {
incompatible := make(stringset.StringSet)
basesMap := make(map[string]dep.Base)
alpmArch, err := dbExecutor.AlpmArchitectures()
if err != nil {
return nil, err
}
nextpkg:
for _, base := range bases {
for _, arch := range srcinfos[base.Pkgbase()].Arch {
if alpmArchIsSupported(alpmArch, arch) {
continue nextpkg
}
}
incompatible.Set(base.Pkgbase())
basesMap[base.Pkgbase()] = base
}
if len(incompatible) > 0 {
text.Warnln(gotext.Get("The following packages are not compatible with your architecture:"))
for pkg := range incompatible {
fmt.Print(" " + text.Cyan(basesMap[pkg].String()))
}
fmt.Println()
if !text.ContinueTask(os.Stdin, gotext.Get("Try to build them anyway?"), true, settings.NoConfirm) {
return nil, &settings.ErrUserAbort{}
}
}
return incompatible, nil
}
func parsePackageList(ctx context.Context, dir string) (pkgdests map[string]string, pkgVersion string, err error) {
stdout, stderr, err := config.Runtime.CmdBuilder.Capture(
config.Runtime.CmdBuilder.BuildMakepkgCmd(ctx, dir, "--packagelist"))
if err != nil {
return nil, "", fmt.Errorf("%s %s", stderr, err)
}
lines := strings.Split(stdout, "\n")
pkgdests = make(map[string]string)
for _, line := range lines {
if line == "" {
continue
}
fileName := filepath.Base(line)
split := strings.Split(fileName, "-")
if len(split) < 4 {
return nil, "", errors.New(gotext.Get("cannot find package name: %v", split))
}
// pkgname-pkgver-pkgrel-arch.pkgext
// This assumes 3 dashes after the pkgname, Will cause an error
// if the PKGEXT contains a dash. Please no one do that.
pkgName := strings.Join(split[:len(split)-3], "-")
pkgVersion = strings.Join(split[len(split)-3:len(split)-1], "-")
pkgdests[pkgName] = line
}
return pkgdests, pkgVersion, nil
}
func parseSrcinfoFiles(bases []dep.Base, errIsFatal bool) (map[string]*gosrc.Srcinfo, error) {
srcinfos := make(map[string]*gosrc.Srcinfo)
for k, base := range bases {
pkg := base.Pkgbase()
dir := filepath.Join(config.BuildDir, pkg)
text.OperationInfoln(gotext.Get("(%d/%d) Parsing SRCINFO: %s", k+1, len(bases), text.Cyan(base.String())))
pkgbuild, err := gosrc.ParseFile(filepath.Join(dir, ".SRCINFO"))
if err != nil {
if !errIsFatal {
text.Warnln(gotext.Get("failed to parse %s -- skipping: %s", base.String(), err))
continue
}
return nil, errors.New(gotext.Get("failed to parse %s: %s", base.String(), err))
}
srcinfos[pkg] = pkgbuild
}
return srcinfos, nil
}
func pkgbuildsToSkip(bases []dep.Base, targets stringset.StringSet) stringset.StringSet {
toSkip := make(stringset.StringSet)
for _, base := range bases {
isTarget := false
for _, pkg := range base {
isTarget = isTarget || targets.Get(pkg.Name)
}
if (config.ReDownload == "yes" && isTarget) || config.ReDownload == "all" {
continue
}
dir := filepath.Join(config.BuildDir, base.Pkgbase(), ".SRCINFO")
pkgbuild, err := gosrc.ParseFile(dir)
if err == nil {
if db.VerCmp(pkgbuild.Version(), base.Version()) >= 0 {
toSkip.Set(base.Pkgbase())
}
}
}
return toSkip
}
func gitMerge(ctx context.Context, path, name string) error {
_, stderr, err := config.Runtime.CmdBuilder.Capture(
config.Runtime.CmdBuilder.BuildGitCmd(ctx,
filepath.Join(path, name), "reset", "--hard", "HEAD"))
if err != nil {
return errors.New(gotext.Get("error resetting %s: %s", name, stderr))
}
_, stderr, err = config.Runtime.CmdBuilder.Capture(
config.Runtime.CmdBuilder.BuildGitCmd(ctx,
filepath.Join(path, name), "merge", "--no-edit", "--ff"))
if err != nil {
return errors.New(gotext.Get("error merging %s: %s", name, stderr))
}
return nil
}
func mergePkgbuilds(ctx context.Context, bases []dep.Base) error {
for _, base := range bases {
err := gitMerge(ctx, config.BuildDir, base.Pkgbase())
if err != nil {
return err
}
}
return nil
}
func buildInstallPkgbuilds(
ctx context.Context,
cmdArgs *parser.Arguments,
dbExecutor db.Executor,
dp *dep.Pool,
do *dep.Order,
srcinfos map[string]*gosrc.Srcinfo,
incompatible stringset.StringSet,
conflicts stringset.MapStringSet, noDeps, noCheck bool,
) error {
arguments := cmdArgs.Copy()
arguments.ClearTargets()
arguments.Op = "U"
arguments.DelArg("confirm")
arguments.DelArg("noconfirm")
arguments.DelArg("c", "clean")
arguments.DelArg("q", "quiet")
arguments.DelArg("q", "quiet")
arguments.DelArg("y", "refresh")
arguments.DelArg("u", "sysupgrade")
arguments.DelArg("w", "downloadonly")
deps := make([]string, 0)
exp := make([]string, 0)
oldConfirm := settings.NoConfirm
settings.NoConfirm = true
// remotenames: names of all non repo packages on the system
localNames, remoteNames, err := query.GetPackageNamesBySource(dbExecutor)
if err != nil {
return err
}
// cache as a stringset. maybe make it return a string set in the first
// place
remoteNamesCache := stringset.FromSlice(remoteNames)
localNamesCache := stringset.FromSlice(localNames)
for i, base := range do.Aur {
pkg := base.Pkgbase()
dir := filepath.Join(config.BuildDir, pkg)
built := true
satisfied := true
all:
for _, pkg := range base {
for _, dep := range dep.ComputeCombinedDepList(pkg, noDeps, noCheck) {
if !dp.AlpmExecutor.LocalSatisfierExists(dep) {
satisfied = false
text.Warnln(gotext.Get("%s not satisfied, flushing install queue", dep))
break all
}
}
}
if !satisfied || !config.BatchInstall {
err = doInstall(ctx, arguments, cmdArgs, deps, exp)
arguments.ClearTargets()
deps = make([]string, 0)
exp = make([]string, 0)
if err != nil {
if i != 0 {
go config.Runtime.VCSStore.RemovePackage([]string{do.Aur[i-1].String()})
}
return err
}
}
srcinfo := srcinfos[pkg]
args := []string{"--nobuild", "-fC"}
if incompatible.Get(pkg) {
args = append(args, "--ignorearch")
}
// pkgver bump
if err = config.Runtime.CmdBuilder.Show(
config.Runtime.CmdBuilder.BuildMakepkgCmd(ctx, dir, args...)); err != nil {
return errors.New(gotext.Get("error making: %s", base.String()))
}
pkgdests, pkgVersion, errList := parsePackageList(ctx, dir)
if errList != nil {
return errList
}
isExplicit := false
for _, b := range base {
isExplicit = isExplicit || dp.Explicit.Get(b.Name)
}
if config.ReBuild == "no" || (config.ReBuild == "yes" && !isExplicit) {
for _, split := range base {
pkgdest, ok := pkgdests[split.Name]
if !ok {
return errors.New(gotext.Get("could not find PKGDEST for: %s", split.Name))
}
if _, errStat := os.Stat(pkgdest); os.IsNotExist(errStat) {
built = false
} else if errStat != nil {
return errStat
}
}
} else {
built = false
}
if cmdArgs.ExistsArg("needed") {
installed := true
for _, split := range base {
installed = dp.AlpmExecutor.IsCorrectVersionInstalled(split.Name, pkgVersion)
}
if installed {
err = config.Runtime.CmdBuilder.Show(
config.Runtime.CmdBuilder.BuildMakepkgCmd(ctx,
dir, "-c", "--nobuild", "--noextract", "--ignorearch"))
if err != nil {
return errors.New(gotext.Get("error making: %s", err))
}
fmt.Fprintln(os.Stdout, gotext.Get("%s is up to date -- skipping", text.Cyan(pkg+"-"+pkgVersion)))
continue
}
}
if built {
err = config.Runtime.CmdBuilder.Show(
config.Runtime.CmdBuilder.BuildMakepkgCmd(ctx,
dir, "-c", "--nobuild", "--noextract", "--ignorearch"))
if err != nil {
return errors.New(gotext.Get("error making: %s", err))
}
text.Warnln(gotext.Get("%s already made -- skipping build", text.Cyan(pkg+"-"+pkgVersion)))
} else {
args := []string{"-cf", "--noconfirm", "--noextract", "--noprepare", "--holdver"}
if incompatible.Get(pkg) {
args = append(args, "--ignorearch")
}
if errMake := config.Runtime.CmdBuilder.Show(
config.Runtime.CmdBuilder.BuildMakepkgCmd(ctx,
dir, args...)); errMake != nil {
return errors.New(gotext.Get("error making: %s", base.String()))
}
}
// conflicts have been checked so answer y for them
if config.UseAsk && cmdArgs.ExistsArg("ask") {
ask, _ := strconv.Atoi(cmdArgs.Options["ask"].First())
uask := alpm.QuestionType(ask) | alpm.QuestionTypeConflictPkg
cmdArgs.Options["ask"].Set(fmt.Sprint(uask))
} else {
for _, split := range base {
if _, ok := conflicts[split.Name]; ok {
settings.NoConfirm = false
break
}
}
}
var errAdd error
for _, split := range base {
for suffix, optional := range map[string]bool{"": false, "-debug": true} {
deps, exp, errAdd = doAddTarget(dp, localNamesCache, remoteNamesCache,
arguments, cmdArgs, pkgdests, deps, exp, split.Name+suffix, optional)
if errAdd != nil {
return errAdd
}
}
}
var (
mux sync.Mutex
wg sync.WaitGroup
)
for _, pkg := range base {
wg.Add(1)
go config.Runtime.VCSStore.Update(ctx, pkg.Name, srcinfo.Source, &mux, &wg)
}
wg.Wait()
}
err = doInstall(ctx, arguments, cmdArgs, deps, exp)
if err != nil {
go config.Runtime.VCSStore.RemovePackage([]string{do.Aur[len(do.Aur)-1].String()})
}
settings.NoConfirm = oldConfirm
return err
}
func doInstall(ctx context.Context, arguments, cmdArgs *parser.Arguments, pkgDeps, pkgExp []string) error {
if len(arguments.Targets) == 0 {
return nil
}
if errShow := config.Runtime.CmdBuilder.Show(config.Runtime.CmdBuilder.BuildPacmanCmd(ctx,
arguments, config.Runtime.Mode, settings.NoConfirm)); errShow != nil {
return errShow
}
if errStore := config.Runtime.VCSStore.Save(); errStore != nil {
fmt.Fprintln(os.Stderr, errStore)
}
if errDeps := asdeps(ctx, cmdArgs, pkgDeps); errDeps != nil {
return errDeps
}
return asexp(ctx, cmdArgs, pkgExp)
}
func doAddTarget(dp *dep.Pool, localNamesCache, remoteNamesCache stringset.StringSet,
arguments, cmdArgs *parser.Arguments, pkgdests map[string]string,
deps, exp []string, name string, optional bool,
) (newDeps, newExp []string, err error) {
pkgdest, ok := pkgdests[name]
if !ok {
if optional {
return deps, exp, nil
}
return deps, exp, errors.New(gotext.Get("could not find PKGDEST for: %s", name))
}
if _, errStat := os.Stat(pkgdest); os.IsNotExist(errStat) {
if optional {
return deps, exp, nil
}
return deps, exp, errors.New(
gotext.Get(
"the PKGDEST for %s is listed by makepkg but does not exist: %s",
name, pkgdest))
}
arguments.AddTarget(pkgdest)
switch {
case cmdArgs.ExistsArg("asdeps", "asdep"):
deps = append(deps, name)
case cmdArgs.ExistsArg("asexplicit", "asexp"):
exp = append(exp, name)
case !dp.Explicit.Get(name) && !localNamesCache.Get(name) && !remoteNamesCache.Get(name):
deps = append(deps, name)
}
return deps, exp, nil
}

108
local_install.go Normal file

@ -0,0 +1,108 @@
// Experimental code for local install with dependency refactoring
// Not at feature parity with install.go
package main
import (
"context"
"errors"
"fmt"
"os"
"path/filepath"
"strings"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/dep"
"github.com/Jguer/yay/v12/pkg/multierror"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/sync"
gosrc "github.com/Morganamilo/go-srcinfo"
"github.com/leonelquinteros/gotext"
)
var ErrNoBuildFiles = errors.New(gotext.Get("cannot find PKGBUILD and .SRCINFO in directory"))
func srcinfoExists(ctx context.Context,
cmdBuilder exe.ICmdBuilder, targetDir string,
) error {
srcInfoDir := filepath.Join(targetDir, ".SRCINFO")
pkgbuildDir := filepath.Join(targetDir, "PKGBUILD")
if _, err := os.Stat(srcInfoDir); err == nil {
if _, err := os.Stat(pkgbuildDir); err == nil {
return nil
}
}
if _, err := os.Stat(pkgbuildDir); err == nil {
// run makepkg to generate .SRCINFO
srcinfo, stderr, err := cmdBuilder.Capture(cmdBuilder.BuildMakepkgCmd(ctx, targetDir, "--printsrcinfo"))
if err != nil {
return fmt.Errorf("unable to generate .SRCINFO: %w - %s", err, stderr)
}
if srcinfo == "" {
return fmt.Errorf("generated .SRCINFO is empty, check your PKGBUILD for errors")
}
if err := os.WriteFile(srcInfoDir, []byte(srcinfo), 0o600); err != nil {
return fmt.Errorf("unable to write .SRCINFO: %w", err)
}
return nil
}
return fmt.Errorf("%w: %s", ErrNoBuildFiles, targetDir)
}
func installLocalPKGBUILD(
ctx context.Context,
run *runtime.Runtime,
cmdArgs *parser.Arguments,
dbExecutor db.Executor,
) error {
aurCache := run.AURClient
noCheck := strings.Contains(run.Cfg.MFlags, "--nocheck")
if len(cmdArgs.Targets) < 1 {
return errors.New(gotext.Get("no target directories specified"))
}
srcInfos := map[string]*gosrc.Srcinfo{}
for _, targetDir := range cmdArgs.Targets {
if err := srcinfoExists(ctx, run.CmdBuilder, targetDir); err != nil {
return err
}
pkgbuild, err := gosrc.ParseFile(filepath.Join(targetDir, ".SRCINFO"))
if err != nil {
return fmt.Errorf("%s: %w", gotext.Get("failed to parse .SRCINFO"), err)
}
srcInfos[targetDir] = pkgbuild
}
grapher := dep.NewGrapher(dbExecutor, aurCache, false, settings.NoConfirm,
cmdArgs.ExistsDouble("d", "nodeps"), noCheck, cmdArgs.ExistsArg("needed"),
run.Logger.Child("grapher"))
graph, err := grapher.GraphFromSrcInfos(ctx, nil, srcInfos)
if err != nil {
return err
}
opService := sync.NewOperationService(ctx, dbExecutor, run)
multiErr := &multierror.MultiError{}
targets := graph.TopoSortedLayerMap(func(name string, ii *dep.InstallInfo) error {
if ii.Source == dep.Missing {
multiErr.Add(fmt.Errorf("%w: %s %s", ErrPackagesNotFound, name, ii.Version))
}
return nil
})
if err := multiErr.Return(); err != nil {
return err
}
return opService.Run(ctx, run, cmdArgs, targets, []string{})
}
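installLocalPKGBUILD turns the srcinfo graph into install targets via TopoSortedLayerMap, i.e. a layered topological order in which every layer depends only on earlier layers. A standalone sketch of that idea with hypothetical package names (not the yay graph API):

package main

import "fmt"

func main() {
	// dependency edges: package -> packages it depends on (assumed acyclic)
	deps := map[string][]string{
		"base":   {},
		"libfoo": {"base"},
		"libbar": {"base"},
		"app":    {"libfoo", "libbar"},
	}
	done := map[string]bool{}
	var layers [][]string
	for len(done) < len(deps) {
		var layer []string
		for pkg, ds := range deps {
			if done[pkg] {
				continue
			}
			ready := true
			for _, d := range ds {
				if !done[d] {
					ready = false
					break
				}
			}
			if ready {
				layer = append(layer, pkg)
			}
		}
		for _, p := range layer {
			done[p] = true
		}
		layers = append(layers, layer) // each layer can be built/installed as one batch
	}
	fmt.Println(layers) // e.g. [[base] [libbar libfoo] [app]]
}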

1019
local_install_test.go Normal file

File diff suppressed because it is too large.

147
main.go

@ -2,21 +2,24 @@ package main // import "github.com/Jguer/yay"
import (
"context"
"fmt"
"errors"
"os"
"os/exec"
"runtime/debug"
"strings"
pacmanconf "github.com/Morganamilo/go-pacmanconf"
"github.com/leonelquinteros/gotext"
"golang.org/x/term"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/db/ialpm"
"github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/db/ialpm"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
var (
yayVersion = "12.0.4" // To be set by compiler.
localePath = "/usr/share/locale" // To be set by compiler.
)
func initGotext() {
@ -25,7 +28,12 @@ func initGotext() {
}
if lc := os.Getenv("LANGUAGE"); lc != "" {
gotext.Configure(localePath, lc, "yay")
// Split LANGUAGE by ':' and prioritize the first locale
// Should fix in gotext to support this
locales := strings.Split(lc, ":")
if len(locales) > 0 && locales[0] != "" {
gotext.Configure(localePath, locales[0], "yay")
}
} else if lc := os.Getenv("LC_ALL"); lc != "" {
gotext.Configure(localePath, lc, "yay")
} else if lc := os.Getenv("LC_MESSAGES"); lc != "" {
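The added branch handles the fact that LANGUAGE may hold a colon-separated priority list (for example "pt_PT:en") that gotext does not split itself, so only the first non-empty entry is passed to gotext.Configure. A standalone sketch of that selection logic (example values are hypothetical):

package main

import (
	"fmt"
	"strings"
)

// firstLocale returns the first non-empty entry of a colon-separated locale list.
func firstLocale(language string) (string, bool) {
	locales := strings.Split(language, ":")
	if len(locales) > 0 && locales[0] != "" {
		return locales[0], true
	}
	return "", false
}

func main() {
	if lc, ok := firstLocale("pt_PT:en_GB:en"); ok {
		fmt.Println("configure gotext with:", lc) // pt_PT
	}
}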
@ -35,61 +43,8 @@ func initGotext() {
}
}
func initAlpm(cmdArgs *parser.Arguments, pacmanConfigPath string) (*pacmanconf.Config, bool, error) {
root := "/"
if value, _, exists := cmdArgs.GetArg("root", "r"); exists {
root = value
}
pacmanConf, stderr, err := pacmanconf.PacmanConf("--config", pacmanConfigPath, "--root", root)
if err != nil {
cmdErr := err
if stderr != "" {
cmdErr = fmt.Errorf("%s\n%s", err, stderr)
}
return nil, false, cmdErr
}
if dbPath, _, exists := cmdArgs.GetArg("dbpath", "b"); exists {
pacmanConf.DBPath = dbPath
}
if arch := cmdArgs.GetArgs("arch"); arch != nil {
pacmanConf.Architecture = append(pacmanConf.Architecture, arch...)
}
if ignoreArray := cmdArgs.GetArgs("ignore"); ignoreArray != nil {
pacmanConf.IgnorePkg = append(pacmanConf.IgnorePkg, ignoreArray...)
}
if ignoreGroupsArray := cmdArgs.GetArgs("ignoregroup"); ignoreGroupsArray != nil {
pacmanConf.IgnoreGroup = append(pacmanConf.IgnoreGroup, ignoreGroupsArray...)
}
if cacheArray := cmdArgs.GetArgs("cachedir"); cacheArray != nil {
pacmanConf.CacheDir = cacheArray
}
if gpgDir, _, exists := cmdArgs.GetArg("gpgdir"); exists {
pacmanConf.GPGDir = gpgDir
}
useColor := pacmanConf.Color && term.IsTerminal(int(os.Stdout.Fd()))
switch value, _, _ := cmdArgs.GetArg("color"); value {
case "always":
useColor = true
case "auto":
useColor = term.IsTerminal(int(os.Stdout.Fd()))
case "never":
useColor = false
}
return pacmanConf, useColor, nil
}
func main() {
fallbackLog := text.NewLogger(os.Stdout, os.Stderr, os.Stdin, false, "fallback")
var (
err error
ctx = context.Background()
@ -98,8 +53,9 @@ func main() {
defer func() {
if rec := recover(); rec != nil {
text.Errorln(rec)
debug.PrintStack()
fallbackLog.Errorln("Panic occurred:", rec)
fallbackLog.Errorln("Stack trace:", string(debug.Stack()))
ret = 1
}
os.Exit(ret)
@ -108,13 +64,15 @@ func main() {
initGotext()
if os.Geteuid() == 0 {
text.Warnln(gotext.Get("Avoid running yay as root/sudo."))
fallbackLog.Warnln(gotext.Get("Avoid running yay as root/sudo."))
}
config, err = settings.NewConfig(yayVersion)
configPath := settings.GetConfigPath()
// Parse config
cfg, err := settings.NewConfig(fallbackLog, configPath, yayVersion)
if err != nil {
if str := err.Error(); str != "" {
text.Errorln(str)
fallbackLog.Errorln(str)
}
ret = 1
@ -122,16 +80,17 @@ func main() {
return
}
if errS := config.RunMigrations(
settings.DefaultMigrations(), config.Runtime.ConfigPath); errS != nil {
text.Errorln(errS)
if errS := cfg.RunMigrations(fallbackLog,
settings.DefaultMigrations(), configPath, yayVersion); errS != nil {
fallbackLog.Errorln(errS)
}
cmdArgs := parser.MakeArguments()
if err = config.ParseCommandLine(cmdArgs); err != nil {
// Parse command line
if err = cfg.ParseCommandLine(cmdArgs); err != nil {
if str := err.Error(); str != "" {
text.Errorln(str)
fallbackLog.Errorln(str)
}
ret = 1
@ -139,26 +98,17 @@ func main() {
return
}
if config.Runtime.SaveConfig {
if errS := config.Save(config.Runtime.ConfigPath); errS != nil {
text.Errorln(errS)
if cfg.SaveConfig {
if errS := cfg.Save(configPath, yayVersion); errS != nil {
fallbackLog.Errorln(errS)
}
}
if config.SeparateSources {
config.Runtime.QueryBuilder = query.NewSourceQueryBuilder(config.SortBy,
config.Runtime.Mode, config.SearchBy, config.BottomUp, config.SingleLineResults)
} else {
config.Runtime.QueryBuilder = query.NewMixedSourceQueryBuilder(config.SortBy,
config.Runtime.Mode, config.SearchBy, config.BottomUp, config.SingleLineResults)
}
var useColor bool
config.Runtime.PacmanConf, useColor, err = initAlpm(cmdArgs, config.PacmanConf)
// Build run
run, err := runtime.NewRuntime(cfg, cmdArgs, yayVersion)
if err != nil {
if str := err.Error(); str != "" {
text.Errorln(str)
fallbackLog.Errorln(str)
}
ret = 1
@ -166,14 +116,10 @@ func main() {
return
}
config.Runtime.CmdBuilder.SetPacmanDBPath(config.Runtime.PacmanConf.DBPath)
text.UseColor = useColor
dbExecutor, err := ialpm.NewExecutor(config.Runtime.PacmanConf)
dbExecutor, err := ialpm.NewExecutor(run.PacmanConf, run.Logger.Child("db"))
if err != nil {
if str := err.Error(); str != "" {
text.Errorln(str)
fallbackLog.Errorln(str)
}
ret = 1
@ -183,19 +129,20 @@ func main() {
defer func() {
if rec := recover(); rec != nil {
text.Errorln(rec)
debug.PrintStack()
fallbackLog.Errorln("Panic occurred in DB operation:", rec)
fallbackLog.Errorln("Stack trace:", string(debug.Stack()))
}
dbExecutor.Cleanup()
}()
if err = handleCmd(ctx, cmdArgs, db.Executor(dbExecutor)); err != nil {
if err = handleCmd(ctx, run, cmdArgs, dbExecutor); err != nil {
if str := err.Error(); str != "" {
text.Errorln(str)
fallbackLog.Errorln(str)
}
if exitError, ok := err.(*exec.ExitError); ok {
exitError := &exec.ExitError{}
if errors.As(err, &exitError) {
// mirror pacman exit code when applicable
ret = exitError.ExitCode()
return


@ -1,51 +0,0 @@
package main
import (
"testing"
"github.com/Morganamilo/go-pacmanconf"
"github.com/stretchr/testify/assert"
"github.com/Jguer/yay/v11/pkg/settings/parser"
)
func TestPacmanConf(t *testing.T) {
t.Parallel()
expectedPacmanConf := &pacmanconf.Config{
RootDir: "/",
DBPath: "//var/lib/pacman/",
CacheDir: []string{"/cachedir/", "/another/"},
HookDir: []string{"/hookdir/"},
GPGDir: "/gpgdir/",
LogFile: "/logfile",
HoldPkg: []string(nil),
IgnorePkg: []string{"ignore", "this", "package"},
IgnoreGroup: []string{"ignore", "this", "group"},
Architecture: []string{"8086"},
XferCommand: "",
NoUpgrade: []string{"noupgrade"},
NoExtract: []string{"noextract"},
CleanMethod: []string{"KeepInstalled"},
SigLevel: []string{"PackageOptional", "PackageTrustedOnly", "DatabaseOptional", "DatabaseTrustedOnly"},
LocalFileSigLevel: []string(nil),
RemoteFileSigLevel: []string(nil),
UseSyslog: false,
Color: false,
UseDelta: 0,
TotalDownload: false,
CheckSpace: true,
VerbosePkgLists: true,
DisableDownloadTimeout: false,
Repos: []pacmanconf.Repository{
{Name: "repo1", Servers: []string{"repo1"}, SigLevel: []string(nil), Usage: []string{"All"}},
{Name: "repo2", Servers: []string{"repo2"}, SigLevel: []string(nil), Usage: []string{"All"}},
},
}
pacmanConf, color, err := initAlpm(parser.MakeArguments(), "testdata/pacman.conf")
assert.Nil(t, err)
assert.NotNil(t, pacmanConf)
assert.Equal(t, color, false)
assert.EqualValues(t, expectedPacmanConf, pacmanConf)
}

82
pkg/cmd/graph/main.go Normal file

@ -0,0 +1,82 @@
package main
import (
"context"
"errors"
"fmt"
"os"
"path/filepath"
"github.com/Jguer/yay/v12/pkg/db/ialpm"
"github.com/Jguer/yay/v12/pkg/dep"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/Jguer/aur/metadata"
"github.com/leonelquinteros/gotext"
)
func handleCmd(logger *text.Logger) error {
cfg, err := settings.NewConfig(logger, settings.GetConfigPath(), "")
if err != nil {
return err
}
cmdArgs := parser.MakeArguments()
if errP := cfg.ParseCommandLine(cmdArgs); errP != nil {
return errP
}
run, err := runtime.NewRuntime(cfg, cmdArgs, "1.0.0")
if err != nil {
return err
}
dbExecutor, err := ialpm.NewExecutor(run.PacmanConf, logger)
if err != nil {
return err
}
aurCache, err := metadata.New(
metadata.WithCacheFilePath(
filepath.Join(cfg.BuildDir, "aur.json")))
if err != nil {
return fmt.Errorf("%s: %w", gotext.Get("failed to retrieve aur Cache"), err)
}
grapher := dep.NewGrapher(dbExecutor, aurCache, true, settings.NoConfirm,
cmdArgs.ExistsDouble("d", "nodeps"), false, false,
run.Logger.Child("grapher"))
return graphPackage(context.Background(), grapher, cmdArgs.Targets)
}
func main() {
fallbackLog := text.NewLogger(os.Stdout, os.Stderr, os.Stdin, false, "fallback")
if err := handleCmd(fallbackLog); err != nil {
fallbackLog.Errorln(err)
os.Exit(1)
}
}
func graphPackage(
ctx context.Context,
grapher *dep.Grapher,
targets []string,
) error {
if len(targets) != 1 {
return errors.New(gotext.Get("only one target is allowed"))
}
graph, err := grapher.GraphFromAUR(ctx, nil, []string{targets[0]})
if err != nil {
return err
}
fmt.Fprintln(os.Stdout, graph.String())
fmt.Fprintln(os.Stdout, "\nlayers map\n", graph.TopoSortedLayerMap(nil))
return nil
}


@ -13,7 +13,7 @@ import (
"strings"
"time"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v12/pkg/db"
)
type PkgSynchronizer interface {
@ -117,7 +117,7 @@ func createAURList(ctx context.Context, client httpRequestDoer, aurURL string, o
return nil
}
// CreatePackageList appends Repo packages to completion cache.
// createRepoList appends Repo packages to completion cache.
func createRepoList(dbExecutor PkgSynchronizer, out io.Writer) error {
for _, pkg := range dbExecutor.SyncPackages() {
_, err := io.WriteString(out, pkg.Name()+"\t"+pkg.DB().Name()+"\n")


@ -1,3 +1,6 @@
//go:build !integration
// +build !integration
package completion
import (
@ -59,7 +62,7 @@ func Test_createAURList(t *testing.T) {
returnErr: nil,
}
out := &bytes.Buffer{}
err := createAURList(context.TODO(), doer, "https://aur.archlinux.org", out)
err := createAURList(context.Background(), doer, "https://aur.archlinux.org", out)
assert.NoError(t, err)
gotOut := out.String()
assert.Equal(t, expectPackageCompletion, gotOut)
@ -76,7 +79,7 @@ func Test_createAURListHTTPError(t *testing.T) {
}
out := &bytes.Buffer{}
err := createAURList(context.TODO(), doer, "https://aur.archlinux.org", out)
err := createAURList(context.Background(), doer, "https://aur.archlinux.org", out)
assert.EqualError(t, err, "Not available")
}
@ -91,6 +94,6 @@ func Test_createAURListStatusError(t *testing.T) {
}
out := &bytes.Buffer{}
err := createAURList(context.TODO(), doer, "https://aur.archlinux.org", out)
err := createAURList(context.Background(), doer, "https://aur.archlinux.org", out)
assert.EqualError(t, err, "invalid status code: 503")
}


@ -4,6 +4,8 @@ import (
"time"
alpm "github.com/Jguer/go-alpm/v2"
"github.com/Jguer/yay/v12/pkg/text"
)
type (
@ -19,33 +21,48 @@ func VerCmp(v1, v2 string) int {
type Upgrade struct {
Name string
Base string
Repository string
LocalVersion string
RemoteVersion string
Reason alpm.PkgReason
Extra string // Extra information to be displayed
}
type SyncUpgrade struct {
Package alpm.IPackage
LocalVersion string
Reason alpm.PkgReason
}
type Executor interface {
AlpmArchitectures() ([]string, error)
BiggestPackages() []IPackage
Cleanup()
InstalledRemotePackageNames() []string
InstalledRemotePackages() map[string]IPackage
InstalledSyncPackageNames() []string
IsCorrectVersionInstalled(string, string) bool
LastBuildTime() time.Time
LocalPackage(string) IPackage
LocalPackages() []IPackage
LocalSatisfierExists(string) bool
PackageConflicts(IPackage) []Depend
PackageDepends(IPackage) []Depend
PackageGroups(IPackage) []string
PackageOptionalDepends(IPackage) []Depend
PackageProvides(IPackage) []Depend
PackagesFromGroup(string) []IPackage
PackagesFromGroupAndDB(string, string) ([]IPackage, error)
RefreshHandle() error
RepoUpgrades(bool) ([]Upgrade, error)
SyncUpgrades(enableDowngrade bool) (
map[string]SyncUpgrade, error)
Repos() []string
SatisfierFromDB(string, string) IPackage
SatisfierFromDB(string, string) (IPackage, error)
SyncPackage(string) IPackage
SyncPackageFromDB(string, string) IPackage
SyncPackages(...string) []IPackage
SyncSatisfier(string) IPackage
SyncSatisfierExists(string) bool
SetLogger(logger *text.Logger)
}
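The slice-returning RepoUpgrades is replaced by SyncUpgrades, which returns a map keyed by package name, so callers range over the map and read the new version from the attached alpm package. A hypothetical consumer sketch with simplified stand-in types (not the real go-alpm bindings):

package main

import "fmt"

// pkgStub stands in for alpm.IPackage in this sketch
type pkgStub struct{ name, version string }

// syncUpgrade mirrors the shape of the SyncUpgrade struct above
type syncUpgrade struct {
	Package      pkgStub
	LocalVersion string
}

func main() {
	upgrades := map[string]syncUpgrade{
		"example-pkg": {Package: pkgStub{"example-pkg", "2.0.0-1"}, LocalVersion: "1.0.0-1"},
	}
	for name, up := range upgrades {
		fmt.Printf("%s: %s -> %s\n", name, up.LocalVersion, up.Package.version)
	}
}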


@ -11,10 +11,9 @@ import (
pacmanconf "github.com/Morganamilo/go-pacmanconf"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v11/pkg/upgrade"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/text"
)
type AlpmExecutor struct {
@ -23,16 +22,31 @@ type AlpmExecutor struct {
syncDB alpm.IDBList
syncDBsCache []alpm.IDB
conf *pacmanconf.Config
log *text.Logger
installedRemotePkgNames []string
installedRemotePkgMap map[string]alpm.IPackage
installedSyncPkgNames []string
}
func NewExecutor(pacmanConf *pacmanconf.Config) (*AlpmExecutor, error) {
ae := &AlpmExecutor{conf: pacmanConf}
func NewExecutor(pacmanConf *pacmanconf.Config, logger *text.Logger) (*AlpmExecutor, error) {
ae := &AlpmExecutor{
handle: nil,
localDB: nil,
syncDB: nil,
syncDBsCache: []alpm.IDB{},
conf: pacmanConf,
log: logger,
installedRemotePkgNames: nil,
installedRemotePkgMap: nil,
installedSyncPkgNames: nil,
}
err := ae.RefreshHandle()
if err != nil {
if err := ae.RefreshHandle(); err != nil {
return nil, err
}
var err error
ae.localDB, err = ae.handle.LocalDB()
if err != nil {
return nil, err
@ -129,12 +143,14 @@ func configureAlpm(pacmanConf *pacmanconf.Config, alpmHandle *alpm.Handle) error
return alpmHandle.SetCheckSpace(pacmanConf.CheckSpace)
}
func logCallback(level alpm.LogLevel, str string) {
switch level {
case alpm.LogWarning:
text.Warn(str)
case alpm.LogError:
text.Error(str)
func (ae *AlpmExecutor) logCallback() func(level alpm.LogLevel, str string) {
return func(level alpm.LogLevel, str string) {
switch level {
case alpm.LogWarning:
ae.log.Warn(str)
case alpm.LogError:
ae.log.Error(str)
}
}
}
@ -160,7 +176,7 @@ func (ae *AlpmExecutor) questionCallback() func(question alpm.QuestionAny) {
return nil
})
str := text.Bold(gotext.Get("There are %d providers available for %s:", size, qp.Dep()))
str := text.Bold(gotext.Get("There are %[1]d providers available for %[2]s:", size, qp.Dep()))
size = 1
@ -172,28 +188,28 @@ func (ae *AlpmExecutor) questionCallback() func(question alpm.QuestionAny) {
if dbName != thisDB {
dbName = thisDB
str += "\n"
str += text.SprintOperationInfo(gotext.Get("Repository"), " ", dbName, "\n ")
str += ae.log.SprintOperationInfo(gotext.Get("Repository"), " ", dbName, "\n ")
}
str += fmt.Sprintf("%d) %s ", size, pkg.Name())
size++
return nil
})
text.OperationInfoln(str)
ae.log.OperationInfoln(str)
for {
fmt.Println(gotext.Get("\nEnter a number (default=1): "))
ae.log.Println(gotext.Get("\nEnter a number (default=1): "))
// TODO: reenable noconfirm
if settings.NoConfirm {
fmt.Println()
ae.log.Println()
break
}
numberBuf, err := text.GetInput("", false)
numberBuf, err := ae.log.GetInput("", false)
if err != nil {
text.Errorln(err)
ae.log.Errorln(err)
break
}
@ -203,12 +219,12 @@ func (ae *AlpmExecutor) questionCallback() func(question alpm.QuestionAny) {
num, err := strconv.Atoi(numberBuf)
if err != nil {
text.Errorln(gotext.Get("invalid number: %s", numberBuf))
ae.log.Errorln(gotext.Get("invalid number: %s", numberBuf))
continue
}
if num < 1 || num > size {
text.Errorln(gotext.Get("invalid value: %d is not between %d and %d", num, 1, size))
ae.log.Errorln(gotext.Get("invalid value: %d is not between %d and %d", num, 1, size))
continue
}
@ -236,7 +252,7 @@ func (ae *AlpmExecutor) RefreshHandle() error {
}
alpmSetQuestionCallback(alpmHandle, ae.questionCallback())
alpmSetLogCallback(alpmHandle, logCallback)
alpmSetLogCallback(alpmHandle, ae.logCallback())
ae.handle = alpmHandle
ae.syncDBsCache = nil
@ -295,6 +311,22 @@ func (ae *AlpmExecutor) PackagesFromGroup(groupName string) []alpm.IPackage {
return groupPackages
}
func (ae *AlpmExecutor) PackagesFromGroupAndDB(groupName, dbName string) ([]alpm.IPackage, error) {
singleDBList, err := ae.handle.SyncDBListByDBName(dbName)
if err != nil {
return nil, err
}
groupPackages := []alpm.IPackage{}
_ = singleDBList.FindGroupPkgs(groupName).ForEach(func(pkg alpm.IPackage) error {
groupPackages = append(groupPackages, pkg)
return nil
})
return groupPackages, nil
}
func (ae *AlpmExecutor) LocalPackages() []alpm.IPackage {
localPackages := []alpm.IPackage{}
_ = ae.localDB.PkgCache().ForEach(func(pkg alpm.IPackage) error {
@ -353,18 +385,27 @@ func (ae *AlpmExecutor) SyncPackage(pkgName string) alpm.IPackage {
return nil
}
func (ae *AlpmExecutor) SatisfierFromDB(pkgName, dbName string) alpm.IPackage {
func (ae *AlpmExecutor) SyncPackageFromDB(pkgName, dbName string) alpm.IPackage {
singleDB, err := ae.handle.SyncDBByName(dbName)
if err != nil {
return nil
}
foundPkg, err := singleDB.PkgCache().FindSatisfier(pkgName)
return singleDB.Pkg(pkgName)
}
func (ae *AlpmExecutor) SatisfierFromDB(pkgName, dbName string) (alpm.IPackage, error) {
singleDBList, err := ae.handle.SyncDBListByDBName(dbName)
if err != nil {
return nil
return nil, err
}
return foundPkg
foundPkg, err := singleDBList.FindSatisfier(pkgName)
if err != nil {
return nil, nil
}
return foundPkg, nil
}
func (ae *AlpmExecutor) PackageDepends(pkg alpm.IPackage) []alpm.Depend {
@ -382,11 +423,6 @@ func (ae *AlpmExecutor) PackageProvides(pkg alpm.IPackage) []alpm.Depend {
return alpmPackage.Provides().Slice()
}
func (ae *AlpmExecutor) PackageConflicts(pkg alpm.IPackage) []alpm.Depend {
alpmPackage := pkg.(*alpm.Package)
return alpmPackage.Conflicts().Slice()
}
func (ae *AlpmExecutor) PackageGroups(pkg alpm.IPackage) []string {
alpmPackage := pkg.(*alpm.Package)
return alpmPackage.Groups().Slice()
@ -394,18 +430,19 @@ func (ae *AlpmExecutor) PackageGroups(pkg alpm.IPackage) []string {
// SyncUpgrades gathers installed packages and checks whether newer versions are available.
// Output: map of SyncUpgrade keyed by package name.
func (ae *AlpmExecutor) RepoUpgrades(enableDowngrade bool) ([]db.Upgrade, error) {
func (ae *AlpmExecutor) SyncUpgrades(enableDowngrade bool) (
map[string]db.SyncUpgrade, error,
) {
ups := map[string]db.SyncUpgrade{}
var errReturn error
slice := []db.Upgrade{}
localDB, errDB := ae.handle.LocalDB()
if errDB != nil {
return slice, errDB
return ups, errDB
}
if err := ae.handle.TransInit(alpm.TransFlagNoLock); err != nil {
return slice, err
return ups, err
}
defer func() {
@ -413,7 +450,7 @@ func (ae *AlpmExecutor) RepoUpgrades(enableDowngrade bool) ([]db.Upgrade, error)
}()
if err := ae.handle.SyncSysupgrade(enableDowngrade); err != nil {
return slice, err
return ups, err
}
_ = ae.handle.TransGetAdd().ForEach(func(pkg alpm.IPackage) error {
@ -425,17 +462,16 @@ func (ae *AlpmExecutor) RepoUpgrades(enableDowngrade bool) ([]db.Upgrade, error)
reason = localPkg.Reason()
}
slice = append(slice, upgrade.Upgrade{
Name: pkg.Name(),
Repository: pkg.DB().Name(),
LocalVersion: localVer,
RemoteVersion: pkg.Version(),
Reason: reason,
})
ups[pkg.Name()] = db.SyncUpgrade{
Package: pkg,
Reason: reason,
LocalVersion: localVer,
}
return nil
})
return slice, errReturn
return ups, errReturn
}
func (ae *AlpmExecutor) BiggestPackages() []alpm.IPackage {

View File

@ -1,11 +1,18 @@
//go:build !integration
// +build !integration
package ialpm
import (
"io"
"strings"
"testing"
alpm "github.com/Jguer/go-alpm/v2"
"github.com/Morganamilo/go-pacmanconf"
"github.com/stretchr/testify/assert"
"github.com/Jguer/yay/v12/pkg/text"
)
func TestAlpmExecutor(t *testing.T) {
@ -41,7 +48,7 @@ func TestAlpmExecutor(t *testing.T) {
},
}
aExec, err := NewExecutor(pacmanConf)
aExec, err := NewExecutor(pacmanConf, text.NewLogger(io.Discard, io.Discard, strings.NewReader(""), false, "test"))
assert.NoError(t, err)
assert.NotNil(t, aExec.conf)

View File

@ -0,0 +1,54 @@
package ialpm
import (
alpm "github.com/Jguer/go-alpm/v2"
"github.com/Jguer/yay/v12/pkg/text"
)
// getPackageNamesBySource populates the caches of installed package names, split into those with a counterpart in the sync DBs and those without (foreign/AUR packages).
func (ae *AlpmExecutor) getPackageNamesBySource() {
if ae.installedRemotePkgMap == nil {
ae.installedRemotePkgMap = map[string]alpm.IPackage{}
}
for _, localpkg := range ae.LocalPackages() {
pkgName := localpkg.Name()
if ae.SyncPackage(pkgName) != nil {
ae.installedSyncPkgNames = append(ae.installedSyncPkgNames, pkgName)
} else {
ae.installedRemotePkgNames = append(ae.installedRemotePkgNames, pkgName)
ae.installedRemotePkgMap[pkgName] = localpkg
}
}
ae.log.Debugln("populating db executor package caches.",
"sync_len", len(ae.installedSyncPkgNames), "remote_len", len(ae.installedRemotePkgNames))
}
func (ae *AlpmExecutor) InstalledRemotePackages() map[string]alpm.IPackage {
if ae.installedRemotePkgMap == nil {
ae.getPackageNamesBySource()
}
return ae.installedRemotePkgMap
}
func (ae *AlpmExecutor) InstalledRemotePackageNames() []string {
if ae.installedRemotePkgNames == nil {
ae.getPackageNamesBySource()
}
return ae.installedRemotePkgNames
}
func (ae *AlpmExecutor) InstalledSyncPackageNames() []string {
if ae.installedSyncPkgNames == nil {
ae.getPackageNamesBySource()
}
return ae.installedSyncPkgNames
}
func (ae *AlpmExecutor) SetLogger(logger *text.Logger) {
ae.log = logger
}
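All three accessors above lazily populate the caches on first use and reuse them afterwards. A stripped-down, standalone illustration of that nil-checked pattern (names and data invented for the example):

package main

import "fmt"

type catalog struct {
	remote []string // nil until the first accessor call, like the executor caches above
}

func (c *catalog) populate() {
	// stands in for getPackageNamesBySource walking the local db once
	c.remote = []string{"yay-bin", "informant"}
}

func (c *catalog) Remote() []string {
	if c.remote == nil {
		c.populate()
	}
	return c.remote
}

func main() {
	c := &catalog{}
	fmt.Println(c.Remote()) // first call populates
	fmt.Println(c.Remote()) // second call reuses the cached slice
}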

View File

@ -3,7 +3,8 @@ package mock
import (
"time"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/Jguer/go-alpm/v2"
)
@ -14,92 +15,200 @@ type (
Upgrade = db.Upgrade
)
type DBExecutor struct{}
type DBExecutor struct {
db.Executor
AlpmArchitecturesFn func() ([]string, error)
InstalledRemotePackageNamesFn func() []string
InstalledRemotePackagesFn func() map[string]IPackage
IsCorrectVersionInstalledFn func(string, string) bool
LocalPackageFn func(string) IPackage
LocalPackagesFn func() []IPackage
LocalSatisfierExistsFn func(string) bool
PackageDependsFn func(IPackage) []Depend
PackageOptionalDependsFn func(alpm.IPackage) []alpm.Depend
PackageProvidesFn func(IPackage) []Depend
PackagesFromGroupFn func(string) []IPackage
PackagesFromGroupAndDBFn func(string, string) ([]IPackage, error)
RefreshHandleFn func() error
ReposFn func() []string
SyncPackageFn func(string) IPackage
SyncPackagesFn func(...string) []IPackage
SyncSatisfierFn func(string) IPackage
SatisfierFromDBFn func(string, string) (IPackage, error)
SyncUpgradesFn func(bool) (map[string]db.SyncUpgrade, error)
SetLoggerFn func(*text.Logger)
}
func (t DBExecutor) AlpmArchitectures() ([]string, error) {
func (t *DBExecutor) InstalledRemotePackageNames() []string {
if t.InstalledRemotePackageNamesFn != nil {
return t.InstalledRemotePackageNamesFn()
}
panic("implement me")
}
func (t DBExecutor) BiggestPackages() []IPackage {
func (t *DBExecutor) InstalledRemotePackages() map[string]IPackage {
if t.InstalledRemotePackagesFn != nil {
return t.InstalledRemotePackagesFn()
}
panic("implement me")
}
func (t DBExecutor) Cleanup() {
func (t *DBExecutor) AlpmArchitectures() ([]string, error) {
if t.AlpmArchitecturesFn != nil {
return t.AlpmArchitecturesFn()
}
panic("implement me")
}
func (t DBExecutor) IsCorrectVersionInstalled(s, s2 string) bool {
func (t *DBExecutor) BiggestPackages() []IPackage {
panic("implement me")
}
func (t DBExecutor) LastBuildTime() time.Time {
func (t *DBExecutor) Cleanup() {
panic("implement me")
}
func (t DBExecutor) LocalPackage(s string) IPackage {
return nil
}
func (t DBExecutor) LocalPackages() []IPackage {
func (t *DBExecutor) IsCorrectVersionInstalled(s, s2 string) bool {
if t.IsCorrectVersionInstalledFn != nil {
return t.IsCorrectVersionInstalledFn(s, s2)
}
panic("implement me")
}
func (t DBExecutor) LocalSatisfierExists(s string) bool {
func (t *DBExecutor) LastBuildTime() time.Time {
panic("implement me")
}
func (t DBExecutor) PackageConflicts(iPackage IPackage) []Depend {
func (t *DBExecutor) LocalPackage(s string) IPackage {
if t.LocalPackageFn != nil {
return t.LocalPackageFn(s)
}
panic("implement me")
}
func (t DBExecutor) PackageDepends(iPackage IPackage) []Depend {
func (t *DBExecutor) LocalPackages() []IPackage {
if t.LocalPackagesFn != nil {
return t.LocalPackagesFn()
}
panic("implement me")
}
func (t DBExecutor) PackageGroups(iPackage IPackage) []string {
func (t *DBExecutor) LocalSatisfierExists(s string) bool {
if t.LocalSatisfierExistsFn != nil {
return t.LocalSatisfierExistsFn(s)
}
panic("implement me")
}
func (t *DBExecutor) PackageConflicts(iPackage IPackage) []Depend {
panic("implement me")
}
func (t *DBExecutor) PackageDepends(iPackage IPackage) []Depend {
if t.PackageDependsFn != nil {
return t.PackageDependsFn(iPackage)
}
panic("implement me")
}
func (t *DBExecutor) PackageGroups(iPackage IPackage) []string {
return []string{}
}
func (t DBExecutor) PackageOptionalDepends(iPackage IPackage) []Depend {
func (t *DBExecutor) PackageOptionalDepends(iPackage IPackage) []Depend {
if t.PackageOptionalDependsFn != nil {
return t.PackageOptionalDependsFn(iPackage)
}
panic("implement me")
}
func (t DBExecutor) PackageProvides(iPackage IPackage) []Depend {
func (t *DBExecutor) PackageProvides(iPackage IPackage) []Depend {
if t.PackageProvidesFn != nil {
return t.PackageProvidesFn(iPackage)
}
panic("implement me")
}
func (t DBExecutor) PackagesFromGroup(s string) []IPackage {
func (t *DBExecutor) PackagesFromGroup(s string) []IPackage {
if t.PackagesFromGroupFn != nil {
return t.PackagesFromGroupFn(s)
}
panic("implement me")
}
func (t DBExecutor) RefreshHandle() error {
func (t *DBExecutor) PackagesFromGroupAndDB(s, s2 string) ([]IPackage, error) {
if t.PackagesFromGroupAndDBFn != nil {
return t.PackagesFromGroupAndDBFn(s, s2)
}
panic("implement me")
}
func (t DBExecutor) RepoUpgrades(b bool) ([]Upgrade, error) {
func (t *DBExecutor) RefreshHandle() error {
if t.RefreshHandleFn != nil {
return t.RefreshHandleFn()
}
panic("implement me")
}
func (t DBExecutor) Repos() []string {
func (t *DBExecutor) SyncUpgrades(b bool) (map[string]db.SyncUpgrade, error) {
if t.SyncUpgradesFn != nil {
return t.SyncUpgradesFn(b)
}
panic("implement me")
}
func (t DBExecutor) SatisfierFromDB(s, s2 string) IPackage {
func (t *DBExecutor) Repos() []string {
if t.ReposFn != nil {
return t.ReposFn()
}
panic("implement me")
}
func (t DBExecutor) SyncPackage(s string) IPackage {
func (t *DBExecutor) SatisfierFromDB(s, s2 string) (IPackage, error) {
if t.SatisfierFromDBFn != nil {
return t.SatisfierFromDBFn(s, s2)
}
panic("implement me")
}
func (t DBExecutor) SyncPackages(s ...string) []IPackage {
func (t *DBExecutor) SyncPackage(s string) IPackage {
if t.SyncPackageFn != nil {
return t.SyncPackageFn(s)
}
panic("implement me")
}
func (t DBExecutor) SyncSatisfier(s string) IPackage {
func (t *DBExecutor) SyncPackages(s ...string) []IPackage {
if t.SyncPackagesFn != nil {
return t.SyncPackagesFn(s...)
}
panic("implement me")
}
func (t DBExecutor) SyncSatisfierExists(s string) bool {
func (t *DBExecutor) SyncSatisfier(s string) IPackage {
if t.SyncSatisfierFn != nil {
return t.SyncSatisfierFn(s)
}
panic("implement me")
}
func (t *DBExecutor) SyncSatisfierExists(s string) bool {
if t.SyncSatisfierFn != nil {
return t.SyncSatisfierFn(s) != nil
}
panic("implement me")
}
func (t *DBExecutor) SetLogger(logger *text.Logger) {
if t.SetLoggerFn != nil {
t.SetLoggerFn(logger)
return
}
panic("implement me")
}
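Because every mock method now consults its matching Fn field before panicking, a test can stub only the calls it expects. A sketch of such a test, assuming testify's assert and the mock package shown above (the exported IPackage alias is assumed from its use in the struct fields):

func TestResolveWithStubbedExecutor(t *testing.T) {
	dbExec := &mock.DBExecutor{
		LocalSatisfierExistsFn: func(string) bool { return false },
		SyncSatisfierFn:        func(string) mock.IPackage { return nil },
	}

	assert.False(t, dbExec.LocalSatisfierExists("yay"))
	assert.Nil(t, dbExec.SyncSatisfier("yay"))
	// Any method whose Fn field is left nil still panics with "implement me",
	// so unexpected executor calls fail loudly instead of silently succeeding.
}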

View File

@ -6,6 +6,26 @@ import (
alpm "github.com/Jguer/go-alpm/v2"
)
type DependList struct {
Depends []Depend
}
func (d DependList) Slice() []alpm.Depend {
return d.Depends
}
func (d DependList) ForEach(f func(*alpm.Depend) error) error {
for i := range d.Depends {
dep := &d.Depends[i]
err := f(dep)
if err != nil {
return err
}
}
return nil
}
type Package struct {
PBase string
PBuildDate time.Time
@ -17,6 +37,8 @@ type Package struct {
PSize int64
PVersion string
PReason alpm.PkgReason
PDepends alpm.IDependList
PProvides alpm.IDependList
}
func (p *Package) Base() string {
@ -60,131 +82,137 @@ func (p *Package) Reason() alpm.PkgReason {
}
func (p *Package) FileName() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
func (p *Package) Base64Signature() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
func (p *Package) Validation() alpm.Validation {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Architecture returns the package target Architecture.
func (p *Package) Architecture() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Backup returns a list of package backups.
func (p *Package) Backup() alpm.BackupList {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Conflicts returns the conflicts of the package as a DependList.
func (p *Package) Conflicts() alpm.DependList {
panic("not implemented") // TODO: Implement
func (p *Package) Conflicts() alpm.IDependList {
panic("not implemented")
}
// Depends returns the package's dependency list.
func (p *Package) Depends() alpm.DependList {
panic("not implemented") // TODO: Implement
func (p *Package) Depends() alpm.IDependList {
if p.PDepends != nil {
return p.PDepends
}
return alpm.DependList{}
}
// OptionalDepends returns the package's optional dependency list.
func (p *Package) OptionalDepends() alpm.DependList {
panic("not implemented") // TODO: Implement
func (p *Package) OptionalDepends() alpm.IDependList {
panic("not implemented")
}
// CheckDepends returns the package's check dependency list.
func (p *Package) CheckDepends() alpm.DependList {
panic("not implemented") // TODO: Implement
func (p *Package) CheckDepends() alpm.IDependList {
panic("not implemented")
}
// MakeDepends returns the package's make dependency list.
func (p *Package) MakeDepends() alpm.DependList {
panic("not implemented") // TODO: Implement
func (p *Package) MakeDepends() alpm.IDependList {
panic("not implemented")
}
// Files returns the file list of the package.
func (p *Package) Files() []alpm.File {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// ContainsFile checks if the path is in the package filelist.
func (p *Package) ContainsFile(path string) (alpm.File, error) {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Groups returns the groups the package belongs to.
func (p *Package) Groups() alpm.StringList {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// InstallDate returns the package install date.
func (p *Package) InstallDate() time.Time {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Licenses returns the package license list.
func (p *Package) Licenses() alpm.StringList {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// SHA256Sum returns package SHA256Sum.
func (p *Package) SHA256Sum() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// MD5Sum returns package MD5Sum.
func (p *Package) MD5Sum() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Packager returns package packager name.
func (p *Package) Packager() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Provides returns DependList of packages provides by package.
func (p *Package) Provides() alpm.DependList {
return alpm.DependList{}
func (p *Package) Provides() alpm.IDependList {
if p.PProvides == nil {
return alpm.DependList{}
}
return p.PProvides
}
// Origin returns package origin.
func (p *Package) Origin() alpm.PkgFrom {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// Replaces returns a DependList with the packages this package replaces.
func (p *Package) Replaces() alpm.DependList {
panic("not implemented") // TODO: Implement
func (p *Package) Replaces() alpm.IDependList {
panic("not implemented")
}
// URL returns the upstream URL of the package.
func (p *Package) URL() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// ComputeRequiredBy returns the names of reverse dependencies of a package.
func (p *Package) ComputeRequiredBy() []string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// ComputeOptionalFor returns the names of packages that optionally
// require the given package.
func (p *Package) ComputeOptionalFor() []string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
// SyncNewVersion checks if there is a new version of the
// package in a given DBlist.
func (p *Package) SyncNewVersion(l alpm.IDBList) alpm.IPackage {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
func (p *Package) Type() string {
panic("not implemented") // TODO: Implement
panic("not implemented")
}
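With the new PDepends and PProvides fields plus the DependList helper at the top of this file, tests can hand a mock package real dependency data. A small sketch; package names, versions, and the fmt usage are purely illustrative:

pkg := &mock.Package{
	PVersion:  "1.0.0",
	PDepends:  mock.DependList{Depends: []alpm.Depend{{Name: "glibc"}}},
	PProvides: mock.DependList{Depends: []alpm.Depend{{Name: "libfoo", Version: "1.0.0"}}},
}
fmt.Println(pkg.Depends().Slice()[0].Name)  // glibc
fmt.Println(pkg.Provides().Slice()[0].Name) // libfoo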
type DB struct {

pkg/db/types.go (new file, 15 lines)
View File

@ -0,0 +1,15 @@
package db
func ArchIsSupported(alpmArch []string, arch string) bool {
if arch == "any" {
return true
}
for _, a := range alpmArch {
if a == arch {
return true
}
}
return false
}
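A quick sanity check of the new helper, written as a hypothetical test placed next to types.go (same package, testify assumed):

func TestArchIsSupported(t *testing.T) {
	assert.True(t, ArchIsSupported([]string{"x86_64"}, "any"))      // "any" always passes
	assert.True(t, ArchIsSupported([]string{"x86_64"}, "x86_64"))   // listed architecture
	assert.False(t, ArchIsSupported([]string{"x86_64"}, "aarch64")) // not in the list
}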

View File

@ -1,68 +0,0 @@
package dep
import (
aur "github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/stringset"
)
// Base is an AUR base package.
type Base []*aur.Pkg
// Pkgbase returns the first base package.
func (b Base) Pkgbase() string {
return b[0].PackageBase
}
// Version returns the first base package version.
func (b Base) Version() string {
return b[0].Version
}
// URLPath returns the first base package URL.
func (b Base) URLPath() string {
return b[0].URLPath
}
func (b Base) AnyIsInSet(set stringset.StringSet) bool {
for _, pkg := range b {
if set.Get(pkg.Name) {
return true
}
}
return false
}
// Packages foo and bar from a pkgbase named base would print like so:
// base (foo bar).
func (b Base) String() string {
pkg := b[0]
str := pkg.PackageBase
if len(b) > 1 || pkg.PackageBase != pkg.Name {
str2 := " ("
for _, split := range b {
str2 += split.Name + " "
}
str2 = str2[:len(str2)-1] + ")"
str += str2
}
return str
}
func GetBases(pkgs []*aur.Pkg) []Base {
basesMap := make(map[string]Base)
for _, pkg := range pkgs {
basesMap[pkg.PackageBase] = append(basesMap[pkg.PackageBase], pkg)
}
bases := make([]Base, 0, len(basesMap))
for _, base := range basesMap {
bases = append(bases, base)
}
return bases
}

View File

@ -3,43 +3,10 @@ package dep
import (
"strings"
"github.com/Jguer/yay/v11/pkg/db"
aur "github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/db"
aur "github.com/Jguer/yay/v12/pkg/query"
)
type providers struct {
lookfor string
Pkgs []*aur.Pkg
}
func makeProviders(name string) providers {
return providers{
name,
make([]*aur.Pkg, 0),
}
}
func (q providers) Len() int {
return len(q.Pkgs)
}
func (q providers) Less(i, j int) bool {
if q.lookfor == q.Pkgs[i].Name {
return true
}
if q.lookfor == q.Pkgs[j].Name {
return false
}
return text.LessRunes([]rune(q.Pkgs[i].Name), []rune(q.Pkgs[j].Name))
}
func (q providers) Swap(i, j int) {
q.Pkgs[i], q.Pkgs[j] = q.Pkgs[j], q.Pkgs[i]
}
func splitDep(dep string) (pkg, mod, ver string) {
split := strings.FieldsFunc(dep, func(c rune) bool {
match := c == '>' || c == '<' || c == '='
@ -80,7 +47,7 @@ func provideSatisfies(provide, dep, pkgVersion string) bool {
return false
}
// Unversioned provieds can not satisfy a versioned dep
// Unversioned provides can not satisfy a versioned dep
if provideMod == "" && depMod != "" {
provideVersion = pkgVersion // Example package: pagure
}
@ -118,17 +85,3 @@ func satisfiesAur(dep string, pkg *aur.Pkg) bool {
return false
}
func satisfiesRepo(dep string, pkg db.IPackage, dbExecutor db.Executor) bool {
if pkgSatisfies(pkg.Name(), pkg.Version(), dep) {
return true
}
for _, provided := range dbExecutor.PackageProvides(pkg) {
if provideSatisfies(provided.String(), dep, pkg.Version()) {
return true
}
}
return false
}
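For readers tracing the satisfier helpers above: splitDep, which stays in this file, breaks a dependency string into its name, comparison operator, and version. A compressed, standalone re-implementation of that behavior for illustration only, not the exact function:

package main

import (
	"fmt"
	"strings"
)

func splitDep(dep string) (name, mod, ver string) {
	fields := strings.FieldsFunc(dep, func(c rune) bool {
		return c == '>' || c == '<' || c == '='
	})
	if len(fields) == 0 {
		return "", "", ""
	}
	name = fields[0]
	if len(fields) > 1 {
		ver = fields[len(fields)-1]
		mod = strings.TrimSuffix(strings.TrimPrefix(dep, name), ver)
	}
	return name, mod, ver
}

func main() {
	fmt.Println(splitDep("linux>=6.1")) // linux >= 6.1
	fmt.Println(splitDep("yay"))        // yay (no operator or version)
}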

View File

@ -1,324 +0,0 @@
package dep
import (
"errors"
"fmt"
"os"
"strings"
"sync"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
)
func (dp *Pool) checkInnerConflict(name, conflict string, conflicts stringset.MapStringSet) {
for _, pkg := range dp.Aur {
if pkg.Name == name {
continue
}
if satisfiesAur(conflict, pkg) {
conflicts.Add(name, pkg.Name)
}
}
for _, pkg := range dp.Repo {
if pkg.Name() == name {
continue
}
if satisfiesRepo(conflict, pkg, dp.AlpmExecutor) {
conflicts.Add(name, pkg.Name())
}
}
}
func (dp *Pool) checkForwardConflict(name, conflict string, conflicts stringset.MapStringSet) {
for _, pkg := range dp.AlpmExecutor.LocalPackages() {
if pkg.Name() == name || dp.hasPackage(pkg.Name()) {
continue
}
if satisfiesRepo(conflict, pkg, dp.AlpmExecutor) {
n := pkg.Name()
if n != conflict {
n += " (" + conflict + ")"
}
conflicts.Add(name, n)
}
}
}
func (dp *Pool) checkReverseConflict(name, conflict string, conflicts stringset.MapStringSet) {
for _, pkg := range dp.Aur {
if pkg.Name == name {
continue
}
if satisfiesAur(conflict, pkg) {
if name != conflict {
name += " (" + conflict + ")"
}
conflicts.Add(pkg.Name, name)
}
}
for _, pkg := range dp.Repo {
if pkg.Name() == name {
continue
}
if satisfiesRepo(conflict, pkg, dp.AlpmExecutor) {
if name != conflict {
name += " (" + conflict + ")"
}
conflicts.Add(pkg.Name(), name)
}
}
}
func (dp *Pool) checkInnerConflicts(conflicts stringset.MapStringSet) {
for _, pkg := range dp.Aur {
for _, conflict := range pkg.Conflicts {
dp.checkInnerConflict(pkg.Name, conflict, conflicts)
}
}
for _, pkg := range dp.Repo {
for _, conflict := range dp.AlpmExecutor.PackageConflicts(pkg) {
dp.checkInnerConflict(pkg.Name(), conflict.String(), conflicts)
}
}
}
func (dp *Pool) checkForwardConflicts(conflicts stringset.MapStringSet) {
for _, pkg := range dp.Aur {
for _, conflict := range pkg.Conflicts {
dp.checkForwardConflict(pkg.Name, conflict, conflicts)
}
}
for _, pkg := range dp.Repo {
for _, conflict := range dp.AlpmExecutor.PackageConflicts(pkg) {
dp.checkForwardConflict(pkg.Name(), conflict.String(), conflicts)
}
}
}
func (dp *Pool) checkReverseConflicts(conflicts stringset.MapStringSet) {
for _, pkg := range dp.AlpmExecutor.LocalPackages() {
if dp.hasPackage(pkg.Name()) {
continue
}
for _, conflict := range dp.AlpmExecutor.PackageConflicts(pkg) {
dp.checkReverseConflict(pkg.Name(), conflict.String(), conflicts)
}
}
}
func (dp *Pool) CheckConflicts(useAsk, noConfirm, noDeps bool) (stringset.MapStringSet, error) {
conflicts := make(stringset.MapStringSet)
if noDeps {
return conflicts, nil
}
var wg sync.WaitGroup
innerConflicts := make(stringset.MapStringSet)
wg.Add(2)
text.OperationInfoln(gotext.Get("Checking for conflicts..."))
go func() {
dp.checkForwardConflicts(conflicts)
dp.checkReverseConflicts(conflicts)
wg.Done()
}()
text.OperationInfoln(gotext.Get("Checking for inner conflicts..."))
go func() {
dp.checkInnerConflicts(innerConflicts)
wg.Done()
}()
wg.Wait()
if len(innerConflicts) != 0 {
text.Errorln(gotext.Get("Inner conflicts found:"))
for name, pkgs := range innerConflicts {
str := text.SprintError(name + ":")
for pkg := range pkgs {
str += " " + text.Cyan(pkg) + ","
}
str = strings.TrimSuffix(str, ",")
fmt.Println(str)
}
}
if len(conflicts) != 0 {
text.Errorln(gotext.Get("Package conflicts found:"))
for name, pkgs := range conflicts {
str := text.SprintError(gotext.Get("Installing %s will remove:", text.Cyan(name)))
for pkg := range pkgs {
str += " " + text.Cyan(pkg) + ","
}
str = strings.TrimSuffix(str, ",")
fmt.Println(str)
}
}
// Add the inner conflicts to the conflicts
// These are used to decide what to pass --ask to (if set) or don't pass --noconfirm to
// As we have no idea what the order is yet we add every inner conflict to the slice
for name, pkgs := range innerConflicts {
conflicts[name] = make(stringset.StringSet)
for pkg := range pkgs {
conflicts[pkg] = make(stringset.StringSet)
}
}
if len(conflicts) > 0 {
if !useAsk {
if noConfirm {
return nil, errors.New(gotext.Get("package conflicts can not be resolved with noconfirm, aborting"))
}
text.Errorln(gotext.Get("Conflicting packages will have to be confirmed manually"))
}
}
return conflicts, nil
}
type missing struct {
Good stringset.StringSet
Missing map[string][][]string
}
func (dp *Pool) _checkMissing(dep string, stack []string, missing *missing, noDeps, noCheckDeps bool) {
if missing.Good.Get(dep) {
return
}
if trees, ok := missing.Missing[dep]; ok {
for _, tree := range trees {
if stringSliceEqual(tree, stack) {
return
}
}
missing.Missing[dep] = append(missing.Missing[dep], stack)
return
}
if aurPkg := dp.findSatisfierAur(dep); aurPkg != nil {
missing.Good.Set(dep)
combinedDepList := ComputeCombinedDepList(aurPkg, noDeps, noCheckDeps)
for _, aurDep := range combinedDepList {
if dp.AlpmExecutor.LocalSatisfierExists(aurDep) {
missing.Good.Set(aurDep)
continue
}
dp._checkMissing(aurDep, append(stack, aurPkg.Name), missing, noDeps, noCheckDeps)
}
return
}
if repoPkg := dp.findSatisfierRepo(dep); repoPkg != nil {
missing.Good.Set(dep)
if noDeps {
return
}
for _, dep := range dp.AlpmExecutor.PackageDepends(repoPkg) {
if dp.AlpmExecutor.LocalSatisfierExists(dep.String()) {
missing.Good.Set(dep.String())
continue
}
dp._checkMissing(dep.String(), append(stack, repoPkg.Name()), missing, noDeps, noCheckDeps)
}
return
}
missing.Missing[dep] = [][]string{stack}
}
func stringSliceEqual(a, b []string) bool {
if a == nil && b == nil {
return true
}
if a == nil || b == nil {
return false
}
if len(a) != len(b) {
return false
}
for i := 0; i < len(a); i++ {
if a[i] != b[i] {
return false
}
}
return true
}
func (dp *Pool) CheckMissing(noDeps, noCheckDeps bool) error {
missing := &missing{
make(stringset.StringSet),
make(map[string][][]string),
}
for _, target := range dp.Targets {
dp._checkMissing(target.DepString(), make([]string, 0), missing, noDeps, noCheckDeps)
}
if len(missing.Missing) == 0 {
return nil
}
text.Errorln(gotext.Get("Could not find all required packages:"))
for dep, trees := range missing.Missing {
for _, tree := range trees {
fmt.Fprintf(os.Stderr, "\t%s", text.Cyan(dep))
if len(tree) == 0 {
fmt.Fprint(os.Stderr, gotext.Get(" (Target"))
} else {
fmt.Fprint(os.Stderr, gotext.Get(" (Wanted by: "))
for n := 0; n < len(tree)-1; n++ {
fmt.Fprint(os.Stderr, text.Cyan(tree[n]), " -> ")
}
fmt.Fprint(os.Stderr, text.Cyan(tree[len(tree)-1]))
}
fmt.Fprintln(os.Stderr, ")")
}
}
return fmt.Errorf("")
}

View File

@ -1,199 +0,0 @@
package dep
import (
"fmt"
"github.com/Jguer/yay/v11/pkg/db"
aur "github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
)
type Order struct {
Aur []Base
Repo []db.IPackage
Runtime stringset.StringSet
}
func newOrder() *Order {
return &Order{
make([]Base, 0),
make([]db.IPackage, 0),
make(stringset.StringSet),
}
}
func GetOrder(dp *Pool, noDeps, noCheckDeps bool) *Order {
do := newOrder()
for _, target := range dp.Targets {
dep := target.DepString()
if aurPkg := dp.Aur[dep]; aurPkg != nil && pkgSatisfies(aurPkg.Name, aurPkg.Version, dep) {
do.orderPkgAur(aurPkg, dp, true, noDeps, noCheckDeps)
} else if aurPkg := dp.findSatisfierAur(dep); aurPkg != nil {
do.orderPkgAur(aurPkg, dp, true, noDeps, noCheckDeps)
} else if repoPkg := dp.findSatisfierRepo(dep); repoPkg != nil {
do.orderPkgRepo(repoPkg, dp, true)
}
}
return do
}
func (do *Order) orderPkgAur(pkg *aur.Pkg, dp *Pool, runtime, noDeps, noCheckDeps bool) {
if runtime {
do.Runtime.Set(pkg.Name)
}
delete(dp.Aur, pkg.Name)
for i, dep := range ComputeCombinedDepList(pkg, noDeps, noCheckDeps) {
if aurPkg := dp.findSatisfierAur(dep); aurPkg != nil {
do.orderPkgAur(aurPkg, dp, runtime && i == 0, noDeps, noCheckDeps)
}
if repoPkg := dp.findSatisfierRepo(dep); repoPkg != nil {
do.orderPkgRepo(repoPkg, dp, runtime && i == 0)
}
}
for i, base := range do.Aur {
if base.Pkgbase() == pkg.PackageBase {
do.Aur[i] = append(base, pkg)
return
}
}
do.Aur = append(do.Aur, Base{pkg})
}
func (do *Order) orderPkgRepo(pkg db.IPackage, dp *Pool, runtime bool) {
if runtime {
do.Runtime.Set(pkg.Name())
}
delete(dp.Repo, pkg.Name())
for _, dep := range dp.AlpmExecutor.PackageDepends(pkg) {
if repoPkg := dp.findSatisfierRepo(dep.String()); repoPkg != nil {
do.orderPkgRepo(repoPkg, dp, runtime)
}
}
do.Repo = append(do.Repo, pkg)
}
func (do *Order) HasMake() bool {
lenAur := 0
for _, base := range do.Aur {
lenAur += len(base)
}
return len(do.Runtime) != lenAur+len(do.Repo)
}
func (do *Order) GetMake() []string {
makeOnly := []string{}
for _, base := range do.Aur {
for _, pkg := range base {
if !do.Runtime.Get(pkg.Name) {
makeOnly = append(makeOnly, pkg.Name)
}
}
}
for _, pkg := range do.Repo {
if !do.Runtime.Get(pkg.Name()) {
makeOnly = append(makeOnly, pkg.Name())
}
}
return makeOnly
}
// Print prints repository packages to be downloaded.
func (do *Order) Print() {
repo := ""
repoMake := ""
aurString := ""
aurMake := ""
repoLen := 0
repoMakeLen := 0
aurLen := 0
aurMakeLen := 0
for _, pkg := range do.Repo {
pkgStr := fmt.Sprintf(" %s-%s", pkg.Name(), pkg.Version())
if do.Runtime.Get(pkg.Name()) {
repo += pkgStr
repoLen++
} else {
repoMake += pkgStr
repoMakeLen++
}
}
for _, base := range do.Aur {
pkg := base.Pkgbase()
pkgStr := " " + pkg + "-" + base[0].Version
pkgStrMake := pkgStr
push := false
pushMake := false
switch {
case len(base) > 1, pkg != base[0].Name:
pkgStr += " ("
pkgStrMake += " ("
for _, split := range base {
if do.Runtime.Get(split.Name) {
pkgStr += split.Name + " "
aurLen++
push = true
} else {
pkgStrMake += split.Name + " "
aurMakeLen++
pushMake = true
}
}
pkgStr = pkgStr[:len(pkgStr)-1] + ")"
pkgStrMake = pkgStrMake[:len(pkgStrMake)-1] + ")"
case do.Runtime.Get(base[0].Name):
aurLen++
push = true
default:
aurMakeLen++
pushMake = true
}
if push {
aurString += pkgStr
}
if pushMake {
aurMake += pkgStrMake
}
}
printDownloads("Repo", repoLen, repo)
printDownloads("Repo Make", repoMakeLen, repoMake)
printDownloads("Aur", aurLen, aurString)
printDownloads("Aur Make", aurMakeLen, aurMake)
}
func printDownloads(repoName string, length int, packages string) {
if length < 1 {
return
}
repoInfo := fmt.Sprintf(text.Bold(text.Blue("[%s:%d]")), repoName, length)
fmt.Println(repoInfo + text.Cyan(packages))
}

View File

@ -1,586 +0,0 @@
package dep
import (
"context"
"fmt"
"os"
"sort"
"strconv"
"strings"
"sync"
"github.com/Jguer/aur"
alpm "github.com/Jguer/go-alpm/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/query"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
)
type Target struct {
DB string
Name string
Mod string
Version string
}
func ToTarget(pkg string) Target {
dbName, depString := text.SplitDBFromName(pkg)
name, mod, depVersion := splitDep(depString)
return Target{
DB: dbName,
Name: name,
Mod: mod,
Version: depVersion,
}
}
func (t Target) DepString() string {
return t.Name + t.Mod + t.Version
}
func (t Target) String() string {
if t.DB != "" {
return t.DB + "/" + t.DepString()
}
return t.DepString()
}
type Pool struct {
Targets []Target
Explicit stringset.StringSet
Repo map[string]db.IPackage
Aur map[string]*query.Pkg
AurCache map[string]*query.Pkg
Groups []string
AlpmExecutor db.Executor
Warnings *query.AURWarnings
aurClient aur.ClientInterface
}
func newPool(dbExecutor db.Executor, aurClient aur.ClientInterface) *Pool {
dp := &Pool{
Targets: []Target{},
Explicit: map[string]struct{}{},
Repo: map[string]alpm.IPackage{},
Aur: map[string]*aur.Pkg{},
AurCache: map[string]*aur.Pkg{},
Groups: []string{},
AlpmExecutor: dbExecutor,
Warnings: nil,
aurClient: aurClient,
}
return dp
}
// Includes db/ prefixes and group installs.
func (dp *Pool) ResolveTargets(ctx context.Context, pkgs []string,
mode parser.TargetMode,
ignoreProviders, noConfirm, provides bool, rebuild string, splitN int, noDeps, noCheckDeps bool, assumeInstalled []string,
) error {
// RPC requests are slow
// Combine as many AUR package requests as possible into a single RPC call
aurTargets := make(stringset.StringSet)
pkgs = query.RemoveInvalidTargets(pkgs, mode)
for _, pkg := range pkgs {
target := ToTarget(pkg)
// skip targets already satisfied
// even if the user enters db/pkg and aur/pkg the latter will
// still get skipped even if it's from a different database to
// the one specified
// this is how pacman behaves
if dp.hasPackage(target.DepString()) || isInAssumeInstalled(target.DepString(), assumeInstalled) {
continue
}
var foundPkg db.IPackage
// aur/ prefix means we only check the aur
if target.DB == "aur" || mode == parser.ModeAUR {
dp.Targets = append(dp.Targets, target)
aurTargets.Set(target.DepString())
continue
}
// If there's a different prefix only look in that repo
if target.DB != "" {
foundPkg = dp.AlpmExecutor.SatisfierFromDB(target.DepString(), target.DB)
} else {
// otherwise find it in any repo
foundPkg = dp.AlpmExecutor.SyncSatisfier(target.DepString())
}
if foundPkg != nil {
dp.Targets = append(dp.Targets, target)
dp.Explicit.Set(foundPkg.Name())
dp.ResolveRepoDependency(foundPkg, noDeps)
continue
} else {
// check for groups
// currently we don't resolve the packages in a group
// only check if the group exists
// would be better to check the groups from singleDB if
// the user specified a db but there's no easy way to do
// it without making alpm_lists so don't bother for now
// db/group is probably a rare use case
groupPackages := dp.AlpmExecutor.PackagesFromGroup(target.Name)
if len(groupPackages) > 0 {
dp.Groups = append(dp.Groups, target.String())
for _, pkg := range groupPackages {
dp.Explicit.Set(pkg.Name())
}
continue
}
}
// if there was no db prefix check the aur
if target.DB == "" {
aurTargets.Set(target.DepString())
}
dp.Targets = append(dp.Targets, target)
}
if len(aurTargets) > 0 && mode.AtLeastAUR() {
return dp.resolveAURPackages(ctx, aurTargets, true, ignoreProviders,
noConfirm, provides, rebuild, splitN, noDeps, noCheckDeps)
}
return nil
}
// Pseudo provides finder.
// Try to find provides by performing a search of the package name
// This effectively performs -Ss on each package
// then runs -Si on each result to cache the information.
//
// For example if you were to -S yay then yay -Ss would give:
// yay-git yay-bin yay realyog pacui pacui-git ruby-yard
// These packages will all be added to the cache in case they are needed later
// Of course, only the first three packages provide yay; the rest are just false
// positives.
//
// This method increases dependency resolve time.
func (dp *Pool) findProvides(ctx context.Context, pkgs stringset.StringSet) error {
var (
mux sync.Mutex
wg sync.WaitGroup
)
doSearch := func(pkg string) {
defer wg.Done()
var (
err error
results []query.Pkg
)
// Hack for a bigger search result: if the user wants
// java-environment we can search for just java instead and get
// more hits.
pkg, _, _ = splitDep(pkg) // openimagedenoise-git > ispc-git #1234
words := strings.Split(pkg, "-")
for i := range words {
results, err = dp.aurClient.Search(ctx, strings.Join(words[:i+1], "-"), aur.None)
if err == nil {
break
}
}
if err != nil {
return
}
for iR := range results {
mux.Lock()
if _, ok := dp.AurCache[results[iR].Name]; !ok {
pkgs.Set(results[iR].Name)
}
mux.Unlock()
}
}
for pkg := range pkgs {
if dp.AlpmExecutor.LocalPackage(pkg) != nil {
continue
}
wg.Add(1)
go doSearch(pkg)
}
wg.Wait()
return nil
}
func (dp *Pool) cacheAURPackages(ctx context.Context, _pkgs stringset.StringSet, provides bool, splitN int) error {
pkgs := _pkgs.Copy()
toQuery := make([]string, 0)
for pkg := range pkgs {
if _, ok := dp.AurCache[pkg]; ok {
pkgs.Remove(pkg)
}
}
if len(pkgs) == 0 {
return nil
}
if provides {
err := dp.findProvides(ctx, pkgs)
if err != nil {
return err
}
}
for pkg := range pkgs {
if _, ok := dp.AurCache[pkg]; !ok {
name, _, ver := splitDep(pkg)
if ver != "" {
toQuery = append(toQuery, name, name+"-"+ver)
} else {
toQuery = append(toQuery, name)
}
}
}
info, err := query.AURInfo(ctx, dp.aurClient, toQuery, dp.Warnings, splitN)
if err != nil {
return err
}
for _, pkg := range info {
// Dump everything in cache just in case we need it later
dp.AurCache[pkg.Name] = pkg
}
return nil
}
// Compute dependency lists used in Package dep searching and ordering.
// Order sensitive TOFIX.
func ComputeCombinedDepList(pkg *aur.Pkg, noDeps, noCheckDeps bool) []string {
combinedDepList := make([]string, 0, len(pkg.Depends)+len(pkg.MakeDepends)+len(pkg.CheckDepends))
if !noDeps {
combinedDepList = append(combinedDepList, pkg.Depends...)
}
combinedDepList = append(combinedDepList, pkg.MakeDepends...)
if !noCheckDeps {
combinedDepList = append(combinedDepList, pkg.CheckDepends...)
}
return combinedDepList
}
func (dp *Pool) resolveAURPackages(ctx context.Context,
pkgs stringset.StringSet,
explicit, ignoreProviders, noConfirm, provides bool,
rebuild string, splitN int, noDeps, noCheckDeps bool,
) error {
newPackages := make(stringset.StringSet)
newAURPackages := make(stringset.StringSet)
err := dp.cacheAURPackages(ctx, pkgs, provides, splitN)
if err != nil {
return err
}
if len(pkgs) == 0 {
return nil
}
for name := range pkgs {
_, ok := dp.Aur[name]
if ok {
continue
}
pkg := dp.findSatisfierAurCache(name, ignoreProviders, noConfirm, provides)
if pkg == nil {
continue
}
if explicit {
dp.Explicit.Set(pkg.Name)
}
dp.Aur[pkg.Name] = pkg
combinedDepList := ComputeCombinedDepList(pkg, noDeps, noCheckDeps)
for _, dep := range combinedDepList {
newPackages.Set(dep)
}
}
for dep := range newPackages {
if dp.hasSatisfier(dep) {
continue
}
isInstalled := dp.AlpmExecutor.LocalSatisfierExists(dep)
hm := settings.HideMenus
settings.HideMenus = isInstalled
repoPkg := dp.AlpmExecutor.SyncSatisfier(dep) // has satisfier in repo: fetch it
settings.HideMenus = hm
if isInstalled && (rebuild != "tree" || repoPkg != nil) {
continue
}
if repoPkg != nil {
dp.ResolveRepoDependency(repoPkg, false)
continue
}
// assume it's in the aur
// ditch the versioning because the RPC can't handle it
newAURPackages.Set(dep)
}
err = dp.resolveAURPackages(ctx, newAURPackages, false, ignoreProviders,
noConfirm, provides, rebuild, splitN, noDeps, noCheckDeps)
return err
}
func (dp *Pool) ResolveRepoDependency(pkg db.IPackage, noDeps bool) {
dp.Repo[pkg.Name()] = pkg
if noDeps {
return
}
for _, dep := range dp.AlpmExecutor.PackageDepends(pkg) {
if dp.hasSatisfier(dep.String()) {
continue
}
// has satisfier installed: skip
if dp.AlpmExecutor.LocalSatisfierExists(dep.String()) {
continue
}
// has satisfier in repo: fetch it
if repoPkg := dp.AlpmExecutor.SyncSatisfier(dep.String()); repoPkg != nil {
dp.ResolveRepoDependency(repoPkg, noDeps)
}
}
}
func GetPool(ctx context.Context, pkgs []string,
warnings *query.AURWarnings,
dbExecutor db.Executor,
aurClient aur.ClientInterface,
mode parser.TargetMode,
ignoreProviders, noConfirm, provides bool,
rebuild string, splitN int, noDeps bool, noCheckDeps bool, assumeInstalled []string,
) (*Pool, error) {
dp := newPool(dbExecutor, aurClient)
dp.Warnings = warnings
err := dp.ResolveTargets(ctx, pkgs, mode, ignoreProviders, noConfirm, provides,
rebuild, splitN, noDeps, noCheckDeps, assumeInstalled)
return dp, err
}
func (dp *Pool) findSatisfierAur(dep string) *query.Pkg {
for _, pkg := range dp.Aur {
if satisfiesAur(dep, pkg) {
return pkg
}
}
return nil
}
// This is mostly used to promote packages from the cache
// to the Install list
// Provide a pacman style provider menu if there's more than one candidate
// This acts slightly differently from Pacman: it will give
// a menu even if a package with a matching name exists. I believe this
// method is better because most of the time you are choosing between
// foo and foo-git.
// With Pacman's approach, trying to install foo would never give you
// a menu.
// TODO: maybe intermix repo providers in the menu.
func (dp *Pool) findSatisfierAurCache(dep string, ignoreProviders, noConfirm, provides bool) *query.Pkg {
depName, _, _ := splitDep(dep)
seen := make(stringset.StringSet)
providerSlice := makeProviders(depName)
if dp.AlpmExecutor.LocalPackage(depName) != nil {
if pkg, ok := dp.AurCache[dep]; ok && pkgSatisfies(pkg.Name, pkg.Version, dep) {
return pkg
}
}
if ignoreProviders {
for _, pkg := range dp.AurCache {
if pkgSatisfies(pkg.Name, pkg.Version, dep) {
for _, target := range dp.Targets {
if target.Name == pkg.Name {
return pkg
}
}
}
}
}
for _, pkg := range dp.AurCache {
if seen.Get(pkg.Name) {
continue
}
if pkgSatisfies(pkg.Name, pkg.Version, dep) {
providerSlice.Pkgs = append(providerSlice.Pkgs, pkg)
seen.Set(pkg.Name)
continue
}
for _, provide := range pkg.Provides {
if provideSatisfies(provide, dep, pkg.Version) {
providerSlice.Pkgs = append(providerSlice.Pkgs, pkg)
seen.Set(pkg.Name)
continue
}
}
}
if !provides && providerSlice.Len() >= 1 {
return providerSlice.Pkgs[0]
}
if providerSlice.Len() == 1 {
return providerSlice.Pkgs[0]
}
if providerSlice.Len() > 1 {
sort.Sort(providerSlice)
return providerMenu(dep, providerSlice, noConfirm)
}
return nil
}
func (dp *Pool) findSatisfierRepo(dep string) db.IPackage {
for _, pkg := range dp.Repo {
if satisfiesRepo(dep, pkg, dp.AlpmExecutor) {
return pkg
}
}
return nil
}
func (dp *Pool) hasSatisfier(dep string) bool {
return dp.findSatisfierRepo(dep) != nil || dp.findSatisfierAur(dep) != nil
}
func (dp *Pool) hasPackage(name string) bool {
for _, pkg := range dp.Repo {
if pkg.Name() == name {
return true
}
}
for _, pkg := range dp.Aur {
if pkg.Name == name {
return true
}
}
for _, pkg := range dp.Groups {
if pkg == name {
return true
}
}
return false
}
func isInAssumeInstalled(name string, assumeInstalled []string) bool {
for _, pkgAndVersion := range assumeInstalled {
assumeName, _, _ := splitDep(pkgAndVersion)
depName, _, _ := splitDep(name)
if assumeName == depName {
return true
}
}
return false
}
func providerMenu(dep string, providers providers, noConfirm bool) *query.Pkg {
size := providers.Len()
str := text.Bold(gotext.Get("There are %d providers available for %s:", size, dep))
str += "\n"
size = 1
str += text.SprintOperationInfo(gotext.Get("Repository AUR"), "\n ")
for _, pkg := range providers.Pkgs {
str += fmt.Sprintf("%d) %s ", size, pkg.Name)
size++
}
text.OperationInfoln(str)
for {
fmt.Println(gotext.Get("\nEnter a number (default=1): "))
if noConfirm {
fmt.Println("1")
return providers.Pkgs[0]
}
numberBuf, err := text.GetInput("", false)
if err != nil {
fmt.Fprintln(os.Stderr, err)
break
}
if numberBuf == "" {
return providers.Pkgs[0]
}
num, err := strconv.Atoi(numberBuf)
if err != nil {
text.Errorln(gotext.Get("invalid number: %s", numberBuf))
continue
}
if num < 1 || num >= size {
text.Errorln(gotext.Get("invalid value: %d is not between %d and %d", num, 1, size-1))
continue
}
return providers.Pkgs[num-1]
}
return nil
}

pkg/dep/dep_graph.go (new file, 853 lines)
View File

@ -0,0 +1,853 @@
package dep
import (
"context"
"fmt"
"strconv"
aurc "github.com/Jguer/aur"
alpm "github.com/Jguer/go-alpm/v2"
gosrc "github.com/Morganamilo/go-srcinfo"
mapset "github.com/deckarep/golang-set/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/dep/topo"
"github.com/Jguer/yay/v12/pkg/intrange"
aur "github.com/Jguer/yay/v12/pkg/query"
"github.com/Jguer/yay/v12/pkg/text"
)
type InstallInfo struct {
Source Source
Reason Reason
Version string
LocalVersion string
SrcinfoPath *string
AURBase *string
SyncDBName *string
IsGroup bool
Upgrade bool
Devel bool
}
func (i *InstallInfo) String() string {
return fmt.Sprintf("InstallInfo{Source: %v, Reason: %v}", i.Source, i.Reason)
}
type (
Reason uint
Source int
)
func (r Reason) String() string {
return ReasonNames[r]
}
func (s Source) String() string {
return SourceNames[s]
}
const (
Explicit Reason = iota // 0
Dep // 1
MakeDep // 2
CheckDep // 3
)
var ReasonNames = map[Reason]string{
Explicit: gotext.Get("Explicit"),
Dep: gotext.Get("Dependency"),
MakeDep: gotext.Get("Make Dependency"),
CheckDep: gotext.Get("Check Dependency"),
}
const (
AUR Source = iota
Sync
Local
SrcInfo
Missing
)
var SourceNames = map[Source]string{
AUR: gotext.Get("AUR"),
Sync: gotext.Get("Sync"),
Local: gotext.Get("Local"),
SrcInfo: gotext.Get("SRCINFO"),
Missing: gotext.Get("Missing"),
}
var bgColorMap = map[Source]string{
AUR: "lightblue",
Sync: "lemonchiffon",
Local: "darkolivegreen1",
Missing: "tomato",
}
var colorMap = map[Reason]string{
Explicit: "black",
Dep: "deeppink",
MakeDep: "navyblue",
CheckDep: "forestgreen",
}
type Grapher struct {
logger *text.Logger
providerCache map[string][]aur.Pkg
dbExecutor db.Executor
aurClient aurc.QueryClient
fullGraph bool // If true, the graph will include all dependencies, including already installed and repo packages
noConfirm bool // If true, the graph will not prompt for confirmation
noDeps bool // If true, the graph will not include dependencies
noCheckDeps bool // If true, the graph will not include check dependencies
needed bool // If true, the graph will only include packages that are not installed
}
func NewGrapher(dbExecutor db.Executor, aurCache aurc.QueryClient,
fullGraph, noConfirm, noDeps, noCheckDeps, needed bool,
logger *text.Logger,
) *Grapher {
return &Grapher{
dbExecutor: dbExecutor,
aurClient: aurCache,
fullGraph: fullGraph,
noConfirm: noConfirm,
noDeps: noDeps,
noCheckDeps: noCheckDeps,
needed: needed,
providerCache: make(map[string][]aurc.Pkg, 5),
logger: logger,
}
}
func NewGraph() *topo.Graph[string, *InstallInfo] {
return topo.New[string, *InstallInfo]()
}
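NewGrapher and NewGraph are the entry points that replace the old dependency pool. A hypothetical wiring helper, written as if it lived alongside this file in package dep and assuming a db.Executor, an aurc.QueryClient, and a text.Logger are already constructed:

func buildGraph(ctx context.Context, dbExec db.Executor, aurClient aurc.QueryClient,
	logger *text.Logger, targets []string,
) (*topo.Graph[string, *InstallInfo], error) {
	grapher := NewGrapher(dbExec, aurClient,
		false, // fullGraph
		false, // noConfirm
		false, // noDeps
		false, // noCheckDeps
		false, // needed
		logger)
	return grapher.GraphFromTargets(ctx, NewGraph(), targets)
}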
func (g *Grapher) GraphFromTargets(ctx context.Context,
graph *topo.Graph[string, *InstallInfo], targets []string,
) (*topo.Graph[string, *InstallInfo], error) {
if graph == nil {
graph = NewGraph()
}
aurTargets := make([]string, 0, len(targets))
for _, targetString := range targets {
target := ToTarget(targetString)
switch target.DB {
case "": // unspecified db
if pkg := g.dbExecutor.SyncSatisfier(target.Name); pkg != nil {
g.GraphSyncPkg(ctx, graph, pkg, nil)
continue
}
groupPackages := g.dbExecutor.PackagesFromGroup(target.Name)
if len(groupPackages) > 0 {
dbName := groupPackages[0].DB().Name()
g.GraphSyncGroup(ctx, graph, target.Name, dbName)
continue
}
fallthrough
case "aur":
aurTargets = append(aurTargets, target.Name)
default:
pkg, err := g.dbExecutor.SatisfierFromDB(target.Name, target.DB)
if err != nil {
return nil, err
}
if pkg != nil {
g.GraphSyncPkg(ctx, graph, pkg, nil)
continue
}
groupPackages, err := g.dbExecutor.PackagesFromGroupAndDB(target.Name, target.DB)
if err != nil {
return nil, err
}
if len(groupPackages) > 0 {
g.GraphSyncGroup(ctx, graph, target.Name, target.DB)
continue
}
g.logger.Errorln(gotext.Get("No package found for"), " ", target)
}
}
var errA error
graph, errA = g.GraphFromAUR(ctx, graph, aurTargets)
if errA != nil {
return nil, errA
}
return graph, nil
}
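To make the dispatch above concrete, here is how a few target spellings would be routed (package names purely illustrative):

func exampleTargets() []string {
	return []string{
		"git",         // no prefix: sync satisfier first, then sync groups, then falls through to the AUR branch
		"aur/yay-bin", // "aur/" prefix: sent straight to the AUR lookup
		"extra/rsync", // other prefix: resolved only in that db via SatisfierFromDB / PackagesFromGroupAndDB
	}
}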
func (g *Grapher) pickSrcInfoPkgs(pkgs []*aurc.Pkg) ([]*aurc.Pkg, error) {
final := make([]*aurc.Pkg, 0, len(pkgs))
for i := range pkgs {
g.logger.Println(text.Magenta(strconv.Itoa(i+1)+" ") + text.Bold(pkgs[i].Name) +
" " + text.Cyan(pkgs[i].Version))
g.logger.Println(" " + pkgs[i].Description)
}
g.logger.Infoln(gotext.Get("Packages to exclude") + " (eg: \"1 2 3\", \"1-3\", \"^4\"):")
numberBuf, err := g.logger.GetInput("", g.noConfirm)
if err != nil {
return nil, err
}
include, exclude, _, otherExclude := intrange.ParseNumberMenu(numberBuf)
isInclude := len(exclude) == 0 && otherExclude.Cardinality() == 0
for i := 1; i <= len(pkgs); i++ {
target := i - 1
if isInclude && !include.Get(i) {
final = append(final, pkgs[target])
}
if !isInclude && (exclude.Get(i)) {
final = append(final, pkgs[target])
}
}
return final, nil
}
func (g *Grapher) addAurPkgProvides(pkg *aurc.Pkg, graph *topo.Graph[string, *InstallInfo]) {
for i := range pkg.Provides {
depName, mod, version := splitDep(pkg.Provides[i])
g.logger.Debugln(pkg.String() + " provides: " + depName)
graph.Provides(depName, &alpm.Depend{
Name: depName,
Version: version,
Mod: aurDepModToAlpmDep(mod),
}, pkg.Name)
}
}
func (g *Grapher) GraphFromSrcInfos(ctx context.Context, graph *topo.Graph[string, *InstallInfo],
srcInfos map[string]*gosrc.Srcinfo,
) (*topo.Graph[string, *InstallInfo], error) {
if graph == nil {
graph = NewGraph()
}
aurPkgsAdded := []*aurc.Pkg{}
for pkgBuildDir, pkgbuild := range srcInfos {
pkgBuildDir := pkgBuildDir
aurPkgs, err := makeAURPKGFromSrcinfo(g.dbExecutor, pkgbuild)
if err != nil {
return nil, err
}
if len(aurPkgs) > 1 {
var errPick error
aurPkgs, errPick = g.pickSrcInfoPkgs(aurPkgs)
if errPick != nil {
return nil, errPick
}
}
for _, pkg := range aurPkgs {
pkg := pkg
reason := Explicit
if pkg := g.dbExecutor.LocalPackage(pkg.Name); pkg != nil {
reason = Reason(pkg.Reason())
}
graph.AddNode(pkg.Name)
g.addAurPkgProvides(pkg, graph)
g.ValidateAndSetNodeInfo(graph, pkg.Name, &topo.NodeInfo[*InstallInfo]{
Color: colorMap[reason],
Background: bgColorMap[AUR],
Value: &InstallInfo{
Source: SrcInfo,
Reason: reason,
SrcinfoPath: &pkgBuildDir,
AURBase: &pkg.PackageBase,
Version: pkg.Version,
},
})
}
aurPkgsAdded = append(aurPkgsAdded, aurPkgs...)
}
g.AddDepsForPkgs(ctx, aurPkgsAdded, graph)
return graph, nil
}
func (g *Grapher) AddDepsForPkgs(ctx context.Context, pkgs []*aur.Pkg, graph *topo.Graph[string, *InstallInfo]) {
for _, pkg := range pkgs {
g.addDepNodes(ctx, pkg, graph)
}
}
func (g *Grapher) addDepNodes(ctx context.Context, pkg *aur.Pkg, graph *topo.Graph[string, *InstallInfo]) {
if len(pkg.MakeDepends) > 0 {
g.addNodes(ctx, graph, pkg.Name, pkg.MakeDepends, MakeDep)
}
if !g.noDeps && len(pkg.Depends) > 0 {
g.addNodes(ctx, graph, pkg.Name, pkg.Depends, Dep)
}
if !g.noCheckDeps && !g.noDeps && len(pkg.CheckDepends) > 0 {
g.addNodes(ctx, graph, pkg.Name, pkg.CheckDepends, CheckDep)
}
}
func (g *Grapher) GraphSyncPkg(ctx context.Context,
graph *topo.Graph[string, *InstallInfo],
pkg alpm.IPackage, upgradeInfo *db.SyncUpgrade,
) *topo.Graph[string, *InstallInfo] {
if graph == nil {
graph = NewGraph()
}
graph.AddNode(pkg.Name())
_ = pkg.Provides().ForEach(func(p *alpm.Depend) error {
g.logger.Debugln(pkg.Name() + " provides: " + p.String())
graph.Provides(p.Name, p, pkg.Name())
return nil
})
dbName := pkg.DB().Name()
info := &InstallInfo{
Source: Sync,
Reason: Explicit,
Version: pkg.Version(),
SyncDBName: &dbName,
}
if upgradeInfo == nil {
if localPkg := g.dbExecutor.LocalPackage(pkg.Name()); localPkg != nil {
info.Reason = Reason(localPkg.Reason())
}
} else {
info.Upgrade = true
info.Reason = Reason(upgradeInfo.Reason)
info.LocalVersion = upgradeInfo.LocalVersion
}
g.ValidateAndSetNodeInfo(graph, pkg.Name(), &topo.NodeInfo[*InstallInfo]{
Color: colorMap[info.Reason],
Background: bgColorMap[info.Source],
Value: info,
})
return graph
}
func (g *Grapher) GraphSyncGroup(ctx context.Context,
graph *topo.Graph[string, *InstallInfo],
groupName, dbName string,
) *topo.Graph[string, *InstallInfo] {
if graph == nil {
graph = NewGraph()
}
graph.AddNode(groupName)
g.ValidateAndSetNodeInfo(graph, groupName, &topo.NodeInfo[*InstallInfo]{
Color: colorMap[Explicit],
Background: bgColorMap[Sync],
Value: &InstallInfo{
Source: Sync,
Reason: Explicit,
Version: "",
SyncDBName: &dbName,
IsGroup: true,
},
})
return graph
}
func (g *Grapher) GraphAURTarget(ctx context.Context,
graph *topo.Graph[string, *InstallInfo],
pkg *aurc.Pkg, instalInfo *InstallInfo,
) *topo.Graph[string, *InstallInfo] {
if graph == nil {
graph = NewGraph()
}
graph.AddNode(pkg.Name)
g.addAurPkgProvides(pkg, graph)
g.ValidateAndSetNodeInfo(graph, pkg.Name, &topo.NodeInfo[*InstallInfo]{
Color: colorMap[instalInfo.Reason],
Background: bgColorMap[AUR],
Value: instalInfo,
})
return graph
}
func (g *Grapher) GraphFromAUR(ctx context.Context,
graph *topo.Graph[string, *InstallInfo],
targets []string,
) (*topo.Graph[string, *InstallInfo], error) {
if graph == nil {
graph = NewGraph()
}
if len(targets) == 0 {
return graph, nil
}
aurPkgs, errCache := g.aurClient.Get(ctx, &aurc.Query{By: aurc.Name, Needles: targets})
if errCache != nil {
g.logger.Errorln(errCache)
}
for i := range aurPkgs {
pkg := &aurPkgs[i]
if _, ok := g.providerCache[pkg.Name]; !ok {
g.providerCache[pkg.Name] = []aurc.Pkg{*pkg}
}
}
aurPkgsAdded := []*aurc.Pkg{}
for _, target := range targets {
if cachedProvidePkg, ok := g.providerCache[target]; ok {
aurPkgs = cachedProvidePkg
} else {
var errA error
aurPkgs, errA = g.aurClient.Get(ctx, &aurc.Query{By: aurc.Provides, Needles: []string{target}, Contains: true})
if errA != nil {
g.logger.Errorln(gotext.Get("Failed to find AUR package for"), " ", target, ":", errA)
}
}
if len(aurPkgs) == 0 {
g.logger.Errorln(gotext.Get("No AUR package found for"), " ", target)
continue
}
aurPkg := &aurPkgs[0]
if len(aurPkgs) > 1 {
chosen := g.provideMenu(target, aurPkgs)
aurPkg = chosen
g.providerCache[target] = []aurc.Pkg{*aurPkg}
}
reason := Explicit
if pkg := g.dbExecutor.LocalPackage(aurPkg.Name); pkg != nil {
reason = Reason(pkg.Reason())
if g.needed {
if db.VerCmp(pkg.Version(), aurPkg.Version) >= 0 {
g.logger.Warnln(gotext.Get("%s is up to date -- skipping", text.Cyan(pkg.Name()+"-"+pkg.Version())))
continue
}
}
}
graph = g.GraphAURTarget(ctx, graph, aurPkg, &InstallInfo{
AURBase: &aurPkg.PackageBase,
Reason: reason,
Source: AUR,
Version: aurPkg.Version,
})
aurPkgsAdded = append(aurPkgsAdded, aurPkg)
}
g.AddDepsForPkgs(ctx, aurPkgsAdded, graph)
return graph, nil
}
// Removes found deps from the deps mapset and returns the found deps.
func (g *Grapher) findDepsFromAUR(ctx context.Context,
deps mapset.Set[string],
) []aurc.Pkg {
pkgsToAdd := make([]aurc.Pkg, 0, deps.Cardinality())
if deps.Cardinality() == 0 {
return []aurc.Pkg{}
}
missingNeedles := make([]string, 0, deps.Cardinality())
for _, depString := range deps.ToSlice() {
if _, ok := g.providerCache[depString]; !ok {
depName, _, _ := splitDep(depString)
missingNeedles = append(missingNeedles, depName)
}
}
if len(missingNeedles) != 0 {
g.logger.Debugln("deps to find", missingNeedles)
// provider search is more demanding than a simple search
// try to find name match if possible and then try to find provides.
aurPkgs, errCache := g.aurClient.Get(ctx, &aurc.Query{
By: aurc.Name, Needles: missingNeedles, Contains: false,
})
if errCache != nil {
g.logger.Errorln(errCache)
}
for i := range aurPkgs {
pkg := &aurPkgs[i]
if deps.Contains(pkg.Name) {
g.providerCache[pkg.Name] = append(g.providerCache[pkg.Name], *pkg)
}
for _, val := range pkg.Provides {
if val == pkg.Name {
continue
}
if deps.Contains(val) {
g.providerCache[val] = append(g.providerCache[val], *pkg)
}
}
}
}
for _, depString := range deps.ToSlice() {
var aurPkgs []aurc.Pkg
depName, _, _ := splitDep(depString)
if cachedProvidePkg, ok := g.providerCache[depString]; ok {
aurPkgs = cachedProvidePkg
} else {
var errA error
aurPkgs, errA = g.aurClient.Get(ctx, &aurc.Query{By: aurc.Provides, Needles: []string{depName}, Contains: true})
if errA != nil {
g.logger.Errorln(gotext.Get("Failed to find AUR package for"), depString, ":", errA)
}
}
// remove packages that don't satisfy the dependency
satisfyingPkgs := make([]aurc.Pkg, 0, len(aurPkgs))
for i := range aurPkgs {
if satisfiesAur(depString, &aurPkgs[i]) {
satisfyingPkgs = append(satisfyingPkgs, aurPkgs[i])
}
}
aurPkgs = satisfyingPkgs
if len(aurPkgs) == 0 {
g.logger.Errorln(gotext.Get("No AUR package found for"), " ", depString)
continue
}
pkg := aurPkgs[0]
if len(aurPkgs) > 1 {
chosen := g.provideMenu(depString, aurPkgs)
pkg = *chosen
}
g.providerCache[depString] = []aurc.Pkg{pkg}
deps.Remove(depString)
pkgsToAdd = append(pkgsToAdd, pkg)
}
return pkgsToAdd
}
func (g *Grapher) ValidateAndSetNodeInfo(graph *topo.Graph[string, *InstallInfo],
node string, nodeInfo *topo.NodeInfo[*InstallInfo],
) {
info := graph.GetNodeInfo(node)
if info != nil && info.Value != nil {
if info.Value.Reason < nodeInfo.Value.Reason {
return // refuse to downgrade reason
}
if info.Value.Upgrade {
return // refuse to overwrite an upgrade
}
}
graph.SetNodeInfo(node, nodeInfo)
}
func (g *Grapher) addNodes(
ctx context.Context,
graph *topo.Graph[string, *InstallInfo],
parentPkgName string,
deps []string,
depType Reason,
) {
targetsToFind := mapset.NewThreadUnsafeSet(deps...)
// Check if in graph already
for _, depString := range targetsToFind.ToSlice() {
depName, _, _ := splitDep(depString)
if !graph.Exists(depName) && !graph.ProvidesExists(depName) {
continue
}
if graph.Exists(depName) {
if err := graph.DependOn(depName, parentPkgName); err != nil {
g.logger.Warnln(depString, parentPkgName, err)
}
targetsToFind.Remove(depString)
}
if p := graph.GetProviderNode(depName); p != nil {
if provideSatisfies(p.String(), depString, p.Version) {
if err := graph.DependOn(p.Provider, parentPkgName); err != nil {
g.logger.Warnln(p.Provider, parentPkgName, err)
}
targetsToFind.Remove(depString)
}
}
}
// Check installed
for _, depString := range targetsToFind.ToSlice() {
depName, _, _ := splitDep(depString)
if !g.dbExecutor.LocalSatisfierExists(depString) {
continue
}
if g.fullGraph {
g.ValidateAndSetNodeInfo(
graph,
depName,
&topo.NodeInfo[*InstallInfo]{Color: colorMap[depType], Background: bgColorMap[Local]})
if err := graph.DependOn(depName, parentPkgName); err != nil {
g.logger.Warnln(depName, parentPkgName, err)
}
}
targetsToFind.Remove(depString)
}
// Check Sync
for _, depString := range targetsToFind.ToSlice() {
alpmPkg := g.dbExecutor.SyncSatisfier(depString)
if alpmPkg == nil {
continue
}
if err := graph.DependOn(alpmPkg.Name(), parentPkgName); err != nil {
g.logger.Warnln("repo dep warn:", depString, parentPkgName, err)
}
dbName := alpmPkg.DB().Name()
g.ValidateAndSetNodeInfo(
graph,
alpmPkg.Name(),
&topo.NodeInfo[*InstallInfo]{
Color: colorMap[depType],
Background: bgColorMap[Sync],
Value: &InstallInfo{
Source: Sync,
Reason: depType,
Version: alpmPkg.Version(),
SyncDBName: &dbName,
},
})
if newDeps := alpmPkg.Depends().Slice(); len(newDeps) != 0 && g.fullGraph {
newDepsSlice := make([]string, 0, len(newDeps))
for _, newDep := range newDeps {
newDepsSlice = append(newDepsSlice, newDep.Name)
}
g.addNodes(ctx, graph, alpmPkg.Name(), newDepsSlice, Dep)
}
targetsToFind.Remove(depString)
}
// Check AUR
pkgsToAdd := g.findDepsFromAUR(ctx, targetsToFind)
for i := range pkgsToAdd {
aurPkg := &pkgsToAdd[i]
if err := graph.DependOn(aurPkg.Name, parentPkgName); err != nil {
g.logger.Warnln("aur dep warn:", aurPkg.Name, parentPkgName, err)
}
graph.SetNodeInfo(
aurPkg.Name,
&topo.NodeInfo[*InstallInfo]{
Color: colorMap[depType],
Background: bgColorMap[AUR],
Value: &InstallInfo{
Source: AUR,
Reason: depType,
AURBase: &aurPkg.PackageBase,
Version: aurPkg.Version,
},
})
g.addDepNodes(ctx, aurPkg, graph)
}
// Add missing to graph
for _, depString := range targetsToFind.ToSlice() {
depName, mod, ver := splitDep(depString)
// no dep found. add as missing
if err := graph.DependOn(depName, parentPkgName); err != nil {
g.logger.Warnln("missing dep warn:", depString, parentPkgName, err)
}
graph.SetNodeInfo(depName, &topo.NodeInfo[*InstallInfo]{
Color: colorMap[depType],
Background: bgColorMap[Missing],
Value: &InstallInfo{
Source: Missing,
Reason: depType,
Version: fmt.Sprintf("%s%s", mod, ver),
},
})
}
}
func (g *Grapher) provideMenu(dep string, options []aur.Pkg) *aur.Pkg {
size := len(options)
if size == 1 {
return &options[0]
}
str := text.Bold(gotext.Get("There are %[1]d providers available for %[2]s:", size, dep))
str += "\n"
size = 1
str += g.logger.SprintOperationInfo(gotext.Get("Repository AUR"), "\n ")
for i := range options {
str += fmt.Sprintf("%d) %s ", size, options[i].Name)
size++
}
g.logger.OperationInfoln(str)
for {
g.logger.Println(gotext.Get("\nEnter a number (default=1): "))
if g.noConfirm {
g.logger.Println("1")
return &options[0]
}
numberBuf, err := g.logger.GetInput("", false)
if err != nil {
g.logger.Errorln(err)
return &options[0]
}
if numberBuf == "" {
return &options[0]
}
num, err := strconv.Atoi(numberBuf)
if err != nil {
g.logger.Errorln(gotext.Get("invalid number: %s", numberBuf))
continue
}
if num < 1 || num >= size {
g.logger.Errorln(gotext.Get("invalid value: %d is not between %d and %d",
num, 1, size-1))
continue
}
return &options[num-1]
}
}
func makeAURPKGFromSrcinfo(dbExecutor db.Executor, srcInfo *gosrc.Srcinfo) ([]*aur.Pkg, error) {
pkgs := make([]*aur.Pkg, 0, 1)
alpmArch, err := dbExecutor.AlpmArchitectures()
if err != nil {
return nil, err
}
alpmArch = append(alpmArch, "") // srcinfo assumes no value as ""
getDesc := func(pkg *gosrc.Package) string {
if pkg.Pkgdesc != "" {
return pkg.Pkgdesc
}
return srcInfo.Pkgdesc
}
for i := range srcInfo.Packages {
pkg := &srcInfo.Packages[i]
pkgs = append(pkgs, &aur.Pkg{
ID: 0,
Name: pkg.Pkgname,
PackageBaseID: 0,
PackageBase: srcInfo.Pkgbase,
Version: srcInfo.Version(),
Description: getDesc(pkg),
URL: pkg.URL,
Depends: append(archStringToString(alpmArch, pkg.Depends),
archStringToString(alpmArch, srcInfo.Depends)...),
MakeDepends: archStringToString(alpmArch, srcInfo.MakeDepends),
CheckDepends: archStringToString(alpmArch, srcInfo.CheckDepends),
Conflicts: append(archStringToString(alpmArch, pkg.Conflicts),
archStringToString(alpmArch, srcInfo.Conflicts)...),
Provides: append(archStringToString(alpmArch, pkg.Provides),
archStringToString(alpmArch, srcInfo.Provides)...),
Replaces: append(archStringToString(alpmArch, pkg.Replaces),
archStringToString(alpmArch, srcInfo.Replaces)...),
OptDepends: append(archStringToString(alpmArch, pkg.OptDepends),
archStringToString(alpmArch, srcInfo.OptDepends)...),
Groups: pkg.Groups,
License: pkg.License,
Keywords: []string{},
})
}
return pkgs, nil
}
func archStringToString(alpmArches []string, archString []gosrc.ArchString) []string {
pkgs := make([]string, 0, len(archString))
for _, arch := range archString {
if db.ArchIsSupported(alpmArches, arch.Arch) {
pkgs = append(pkgs, arch.Value)
}
}
return pkgs
}
func aurDepModToAlpmDep(mod string) alpm.DepMod {
switch mod {
case "=":
return alpm.DepModEq
case ">=":
return alpm.DepModGE
case "<=":
return alpm.DepModLE
case ">":
return alpm.DepModGT
case "<":
return alpm.DepModLT
}
return alpm.DepModAny
}
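As a reading aid (not part of the diff): a hypothetical snippet in package dep showing how the helpers above would combine to build an alpm.Depend from an AUR-style dependency string. The (name, mod, version) return order of splitDep is assumed from its call sites in this file.

// Hypothetical example, package dep: turn a dependency string into an alpm.Depend.
func dependFromDepString(depString string) alpm.Depend {
    name, mod, version := splitDep(depString) // e.g. "ceph-libs", ">=", "17.2.6-2"
    return alpm.Depend{
        Name:    name,
        Version: version,
        Mod:     aurDepModToAlpmDep(mod), // ">=" maps to alpm.DepModGE
    }
}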

pkg/dep/dep_graph_test.go (new file, 811 lines)

@ -0,0 +1,811 @@
//go:build !integration
// +build !integration
package dep
import (
"context"
"encoding/json"
"fmt"
"io"
"os"
"testing"
aurc "github.com/Jguer/aur"
alpm "github.com/Jguer/go-alpm/v2"
"github.com/stretchr/testify/require"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/db/mock"
mockaur "github.com/Jguer/yay/v12/pkg/dep/mock"
aur "github.com/Jguer/yay/v12/pkg/query"
"github.com/Jguer/yay/v12/pkg/text"
)
func ptrString(s string) *string {
return &s
}
func getFromFile(t *testing.T, filePath string) mockaur.GetFunc {
f, err := os.Open(filePath)
require.NoError(t, err)
fBytes, err := io.ReadAll(f)
require.NoError(t, err)
pkgs := []aur.Pkg{}
err = json.Unmarshal(fBytes, &pkgs)
require.NoError(t, err)
return func(ctx context.Context, query *aurc.Query) ([]aur.Pkg, error) {
return pkgs, nil
}
}
func TestGrapher_GraphFromTargets_jellyfin(t *testing.T) {
mockDB := &mock.DBExecutor{
SyncPackageFn: func(string) mock.IPackage { return nil },
SyncSatisfierFn: func(s string) mock.IPackage {
switch s {
case "jellyfin":
return nil
case "dotnet-runtime-6.0":
return &mock.Package{
PName: "dotnet-runtime-6.0",
PBase: "dotnet-runtime-6.0",
PVersion: "6.0.100-1",
PDB: mock.NewDB("community"),
}
case "dotnet-sdk-6.0":
return &mock.Package{
PName: "dotnet-sdk-6.0",
PBase: "dotnet-sdk-6.0",
PVersion: "6.0.100-1",
PDB: mock.NewDB("community"),
}
}
return nil
},
PackagesFromGroupFn: func(string) []mock.IPackage { return nil },
LocalSatisfierExistsFn: func(s string) bool {
switch s {
case "dotnet-sdk-6.0", "dotnet-runtime-6.0", "jellyfin-server=10.8.8", "jellyfin-web=10.8.8":
return false
}
return true
},
LocalPackageFn: func(string) mock.IPackage { return nil },
}
mockAUR := &mockaur.MockAUR{GetFn: func(ctx context.Context, query *aurc.Query) ([]aur.Pkg, error) {
if query.Needles[0] == "jellyfin" {
jfinFn := getFromFile(t, "testdata/jellyfin.json")
return jfinFn(ctx, query)
}
if query.Needles[0] == "jellyfin-web" {
jfinWebFn := getFromFile(t, "testdata/jellyfin-web.json")
return jfinWebFn(ctx, query)
}
if query.Needles[0] == "jellyfin-server" {
jfinServerFn := getFromFile(t, "testdata/jellyfin-server.json")
return jfinServerFn(ctx, query)
}
panic(fmt.Sprintf("implement me %v", query.Needles))
}}
type fields struct {
dbExecutor db.Executor
aurCache aurc.QueryClient
noDeps bool
noCheckDeps bool
}
type args struct {
targets []string
}
tests := []struct {
name string
fields fields
args args
want []map[string]*InstallInfo
wantErr bool
}{
{
name: "noDeps",
fields: fields{
dbExecutor: mockDB,
aurCache: mockAUR,
noDeps: true,
noCheckDeps: false,
},
args: args{
targets: []string{"jellyfin"},
},
want: []map[string]*InstallInfo{
{
"jellyfin": {
Source: AUR,
Reason: Explicit,
Version: "10.8.8-1",
AURBase: ptrString("jellyfin"),
},
},
{
"dotnet-sdk-6.0": {
Source: Sync,
Reason: MakeDep,
Version: "6.0.100-1",
SyncDBName: ptrString("community"),
},
},
},
wantErr: false,
},
{
name: "deps",
fields: fields{
dbExecutor: mockDB,
aurCache: mockAUR,
noDeps: false,
noCheckDeps: false,
},
args: args{
targets: []string{"jellyfin"},
},
want: []map[string]*InstallInfo{
{
"jellyfin": {
Source: AUR,
Reason: Explicit,
Version: "10.8.8-1",
AURBase: ptrString("jellyfin"),
},
},
{
"jellyfin-web": {
Source: AUR,
Reason: Dep,
Version: "10.8.8-1",
AURBase: ptrString("jellyfin"),
},
"jellyfin-server": {
Source: AUR,
Reason: Dep,
Version: "10.8.8-1",
AURBase: ptrString("jellyfin"),
},
},
{
"dotnet-sdk-6.0": {
Source: Sync,
Reason: MakeDep,
Version: "6.0.100-1",
SyncDBName: ptrString("community"),
},
"dotnet-runtime-6.0": {
Source: Sync,
Reason: Dep,
Version: "6.0.100-1",
SyncDBName: ptrString("community"),
},
},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
g := NewGrapher(tt.fields.dbExecutor,
tt.fields.aurCache, false, true,
tt.fields.noDeps, tt.fields.noCheckDeps, false,
text.NewLogger(io.Discard, io.Discard, &os.File{}, true, "test"))
got, err := g.GraphFromTargets(context.Background(), nil, tt.args.targets)
require.NoError(t, err)
layers := got.TopoSortedLayerMap(nil)
require.EqualValues(t, tt.want, layers, layers)
})
}
}
func TestGrapher_GraphProvides_androidsdk(t *testing.T) {
mockDB := &mock.DBExecutor{
SyncPackageFn: func(string) mock.IPackage { return nil },
SyncSatisfierFn: func(s string) mock.IPackage {
switch s {
case "android-sdk":
return nil
case "jdk11-openjdk":
return &mock.Package{
PName: "jdk11-openjdk",
PVersion: "11.0.12.u7-1",
PDB: mock.NewDB("community"),
PProvides: mock.DependList{
Depends: []alpm.Depend{
{Name: "java-environment", Version: "11", Mod: alpm.DepModEq},
{Name: "java-environment-openjdk", Version: "11", Mod: alpm.DepModEq},
{Name: "jdk11-openjdk", Version: "11.0.19.u7-1", Mod: alpm.DepModEq},
},
},
}
case "java-environment":
panic("not supposed to be called")
}
panic("implement me " + s)
},
PackagesFromGroupFn: func(string) []mock.IPackage { return nil },
LocalSatisfierExistsFn: func(s string) bool {
switch s {
case "java-environment":
return false
}
switch s {
case "libxtst", "fontconfig", "freetype2", "lib32-gcc-libs", "lib32-glibc", "libx11", "libxext", "libxrender", "zlib", "gcc-libs":
return true
}
panic("implement me " + s)
},
LocalPackageFn: func(string) mock.IPackage { return nil },
}
mockAUR := &mockaur.MockAUR{GetFn: func(ctx context.Context, query *aurc.Query) ([]aur.Pkg, error) {
if query.Needles[0] == "android-sdk" {
jfinFn := getFromFile(t, "testdata/android-sdk.json")
return jfinFn(ctx, query)
}
panic(fmt.Sprintf("implement me %v", query.Needles))
}}
type fields struct {
dbExecutor db.Executor
aurCache aurc.QueryClient
noDeps bool
noCheckDeps bool
}
type args struct {
targets []string
}
tests := []struct {
name string
fields fields
args args
want []map[string]*InstallInfo
wantErr bool
}{
{
name: "explicit dep",
fields: fields{
dbExecutor: mockDB,
aurCache: mockAUR,
noDeps: false,
noCheckDeps: false,
},
args: args{
targets: []string{"android-sdk", "jdk11-openjdk"},
},
want: []map[string]*InstallInfo{
{
"android-sdk": {
Source: AUR,
Reason: Explicit,
Version: "26.1.1-2",
AURBase: ptrString("android-sdk"),
},
},
{
"jdk11-openjdk": {
Source: Sync,
Reason: Explicit,
Version: "11.0.12.u7-1",
SyncDBName: ptrString("community"),
},
},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
g := NewGrapher(tt.fields.dbExecutor,
tt.fields.aurCache, false, true,
tt.fields.noDeps, tt.fields.noCheckDeps, false,
text.NewLogger(io.Discard, io.Discard, &os.File{}, true, "test"))
got, err := g.GraphFromTargets(context.Background(), nil, tt.args.targets)
require.NoError(t, err)
layers := got.TopoSortedLayerMap(nil)
require.EqualValues(t, tt.want, layers, layers)
})
}
}
func TestGrapher_GraphFromAUR_Deps_ceph_bin(t *testing.T) {
mockDB := &mock.DBExecutor{
SyncPackageFn: func(string) mock.IPackage { return nil },
PackagesFromGroupFn: func(string) []mock.IPackage { return []mock.IPackage{} },
SyncSatisfierFn: func(s string) mock.IPackage {
switch s {
case "ceph-bin", "ceph-libs-bin":
return nil
case "ceph", "ceph-libs", "ceph-libs=17.2.6-2":
return nil
}
panic("implement me " + s)
},
LocalSatisfierExistsFn: func(s string) bool {
switch s {
case "ceph-libs", "ceph-libs=17.2.6-2":
return false
case "dep1", "dep2", "dep3", "makedep1", "makedep2", "checkdep1":
return true
}
panic("implement me " + s)
},
LocalPackageFn: func(string) mock.IPackage { return nil },
}
mockAUR := &mockaur.MockAUR{GetFn: func(ctx context.Context, query *aurc.Query) ([]aur.Pkg, error) {
mockPkgs := map[string]aur.Pkg{
"ceph-bin": {
Name: "ceph-bin",
PackageBase: "ceph-bin",
Version: "17.2.6-2",
Depends: []string{"ceph-libs=17.2.6-2", "dep1"},
Provides: []string{"ceph=17.2.6-2"},
},
"ceph-libs-bin": {
Name: "ceph-libs-bin",
PackageBase: "ceph-bin",
Version: "17.2.6-2",
Depends: []string{"dep1", "dep2"},
Provides: []string{"ceph-libs=17.2.6-2"},
},
"ceph": {
Name: "ceph",
PackageBase: "ceph",
Version: "17.2.6-2",
Depends: []string{"ceph-libs=17.2.6-2", "dep1"},
MakeDepends: []string{"makedep1"},
CheckDepends: []string{"checkdep1"},
Provides: []string{"ceph=17.2.6-2"},
},
"ceph-libs": {
Name: "ceph-libs",
PackageBase: "ceph",
Version: "17.2.6-2",
Depends: []string{"dep1", "dep2", "dep3"},
MakeDepends: []string{"makedep1", "makedep2"},
CheckDepends: []string{"checkdep1"},
Provides: []string{"ceph-libs=17.2.6-2"},
},
}
pkgs := []aur.Pkg{}
for _, needle := range query.Needles {
if pkg, ok := mockPkgs[needle]; ok {
pkgs = append(pkgs, pkg)
} else {
panic(fmt.Sprintf("implement me %v", needle))
}
}
return pkgs, nil
}}
installInfos := map[string]*InstallInfo{
"ceph-bin exp": {
Source: AUR,
Reason: Explicit,
Version: "17.2.6-2",
AURBase: ptrString("ceph-bin"),
},
"ceph-libs-bin exp": {
Source: AUR,
Reason: Explicit,
Version: "17.2.6-2",
AURBase: ptrString("ceph-bin"),
},
"ceph exp": {
Source: AUR,
Reason: Explicit,
Version: "17.2.6-2",
AURBase: ptrString("ceph"),
},
"ceph-libs exp": {
Source: AUR,
Reason: Explicit,
Version: "17.2.6-2",
AURBase: ptrString("ceph"),
},
"ceph-libs dep": {
Source: AUR,
Reason: Dep,
Version: "17.2.6-2",
AURBase: ptrString("ceph"),
},
}
tests := []struct {
name string
targets []string
wantLayers []map[string]*InstallInfo
wantErr bool
}{
{
name: "ceph-bin ceph-libs-bin",
targets: []string{"ceph-bin", "ceph-libs-bin"},
wantLayers: []map[string]*InstallInfo{
{"ceph-bin": installInfos["ceph-bin exp"]},
{"ceph-libs-bin": installInfos["ceph-libs-bin exp"]},
},
wantErr: false,
},
{
name: "ceph-libs-bin ceph-bin (reversed order)",
targets: []string{"ceph-libs-bin", "ceph-bin"},
wantLayers: []map[string]*InstallInfo{
{"ceph-bin": installInfos["ceph-bin exp"]},
{"ceph-libs-bin": installInfos["ceph-libs-bin exp"]},
},
wantErr: false,
},
{
name: "ceph",
targets: []string{"ceph"},
wantLayers: []map[string]*InstallInfo{
{"ceph": installInfos["ceph exp"]},
{"ceph-libs": installInfos["ceph-libs dep"]},
},
wantErr: false,
},
{
name: "ceph-bin",
targets: []string{"ceph-bin"},
wantLayers: []map[string]*InstallInfo{
{"ceph-bin": installInfos["ceph-bin exp"]},
{"ceph-libs": installInfos["ceph-libs dep"]},
},
wantErr: false,
},
{
name: "ceph-bin ceph-libs",
targets: []string{"ceph-bin", "ceph-libs"},
wantLayers: []map[string]*InstallInfo{
{"ceph-bin": installInfos["ceph-bin exp"]},
{"ceph-libs": installInfos["ceph-libs exp"]},
},
wantErr: false,
},
{
name: "ceph-libs ceph-bin (reversed order)",
targets: []string{"ceph-libs", "ceph-bin"},
wantLayers: []map[string]*InstallInfo{
{"ceph-bin": installInfos["ceph-bin exp"]},
{"ceph-libs": installInfos["ceph-libs exp"]},
},
wantErr: false,
},
{
name: "ceph ceph-libs-bin",
targets: []string{"ceph", "ceph-libs-bin"},
wantLayers: []map[string]*InstallInfo{
{"ceph": installInfos["ceph exp"]},
{"ceph-libs-bin": installInfos["ceph-libs-bin exp"]},
},
wantErr: false,
},
{
name: "ceph-libs-bin ceph (reversed order)",
targets: []string{"ceph-libs-bin", "ceph"},
wantLayers: []map[string]*InstallInfo{
{"ceph": installInfos["ceph exp"]},
{"ceph-libs-bin": installInfos["ceph-libs-bin exp"]},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
g := NewGrapher(mockDB, mockAUR,
false, true, false, false, false,
text.NewLogger(io.Discard, io.Discard, &os.File{}, true, "test"))
got, err := g.GraphFromTargets(context.Background(), nil, tt.targets)
require.NoError(t, err)
layers := got.TopoSortedLayerMap(nil)
require.EqualValues(t, tt.wantLayers, layers, layers)
})
}
}
func TestGrapher_GraphFromAUR_Deps_gourou(t *testing.T) {
mockDB := &mock.DBExecutor{
SyncPackageFn: func(string) mock.IPackage { return nil },
PackagesFromGroupFn: func(string) []mock.IPackage { return []mock.IPackage{} },
SyncSatisfierFn: func(s string) mock.IPackage {
switch s {
case "gourou", "libzip-git":
return nil
case "libzip":
return &mock.Package{
PName: "libzip",
PVersion: "1.9.2-1",
PDB: mock.NewDB("extra"),
}
}
panic("implement me " + s)
},
LocalSatisfierExistsFn: func(s string) bool {
switch s {
case "gourou", "libzip", "libzip-git":
return false
case "dep1", "dep2":
return true
}
panic("implement me " + s)
},
LocalPackageFn: func(string) mock.IPackage { return nil },
}
mockAUR := &mockaur.MockAUR{GetFn: func(ctx context.Context, query *aurc.Query) ([]aur.Pkg, error) {
mockPkgs := map[string]aur.Pkg{
"gourou": {
Name: "gourou",
PackageBase: "gourou",
Version: "0.8.1",
Depends: []string{"libzip"},
},
"libzip-git": {
Name: "libzip-git",
PackageBase: "libzip-git",
Version: "1.9.2.r159.gb3ac716c-1",
Depends: []string{"dep1", "dep2"},
Provides: []string{"libzip=1.9.2.r159.gb3ac716c"},
},
}
pkgs := []aur.Pkg{}
for _, needle := range query.Needles {
if pkg, ok := mockPkgs[needle]; ok {
pkgs = append(pkgs, pkg)
} else {
panic(fmt.Sprintf("implement me %v", needle))
}
}
return pkgs, nil
}}
installInfos := map[string]*InstallInfo{
"gourou exp": {
Source: AUR,
Reason: Explicit,
Version: "0.8.1",
AURBase: ptrString("gourou"),
},
"libzip dep": {
Source: Sync,
Reason: Dep,
Version: "1.9.2-1",
SyncDBName: ptrString("extra"),
},
"libzip exp": {
Source: Sync,
Reason: Explicit,
Version: "1.9.2-1",
SyncDBName: ptrString("extra"),
},
"libzip-git exp": {
Source: AUR,
Reason: Explicit,
Version: "1.9.2.r159.gb3ac716c-1",
AURBase: ptrString("libzip-git"),
},
}
tests := []struct {
name string
targets []string
wantLayers []map[string]*InstallInfo
wantErr bool
}{
{
name: "gourou",
targets: []string{"gourou"},
wantLayers: []map[string]*InstallInfo{
{"gourou": installInfos["gourou exp"]},
{"libzip": installInfos["libzip dep"]},
},
wantErr: false,
},
{
name: "gourou libzip",
targets: []string{"gourou", "libzip"},
wantLayers: []map[string]*InstallInfo{
{"gourou": installInfos["gourou exp"]},
{"libzip": installInfos["libzip exp"]},
},
wantErr: false,
},
{
name: "gourou libzip-git",
targets: []string{"gourou", "libzip-git"},
wantLayers: []map[string]*InstallInfo{
{"gourou": installInfos["gourou exp"]},
{"libzip-git": installInfos["libzip-git exp"]},
},
wantErr: false,
},
{
name: "libzip-git gourou (reversed order)",
targets: []string{"libzip-git", "gourou"},
wantLayers: []map[string]*InstallInfo{
{"gourou": installInfos["gourou exp"]},
{"libzip-git": installInfos["libzip-git exp"]},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
g := NewGrapher(mockDB, mockAUR,
false, true, false, false, false,
text.NewLogger(io.Discard, io.Discard, &os.File{}, true, "test"))
got, err := g.GraphFromTargets(context.Background(), nil, tt.targets)
require.NoError(t, err)
layers := got.TopoSortedLayerMap(nil)
require.EqualValues(t, tt.wantLayers, layers, layers)
})
}
}
func TestGrapher_GraphFromTargets_ReinstalledDeps(t *testing.T) {
mockDB := &mock.DBExecutor{
SyncPackageFn: func(string) mock.IPackage { return nil },
PackagesFromGroupFn: func(string) []mock.IPackage { return []mock.IPackage{} },
SyncSatisfierFn: func(s string) mock.IPackage {
switch s {
case "gourou":
return nil
case "libzip":
return &mock.Package{
PName: "libzip",
PVersion: "1.9.2-1",
PDB: mock.NewDB("extra"),
}
}
panic("implement me " + s)
},
SatisfierFromDBFn: func(s, s2 string) (mock.IPackage, error) {
if s2 == "extra" {
switch s {
case "libzip":
return &mock.Package{
PName: "libzip",
PVersion: "1.9.2-1",
PDB: mock.NewDB("extra"),
}, nil
}
}
panic("implement me " + s2 + "/" + s)
},
LocalSatisfierExistsFn: func(s string) bool {
switch s {
case "gourou", "libzip":
return true
}
panic("implement me " + s)
},
LocalPackageFn: func(s string) mock.IPackage {
switch s {
case "libzip":
return &mock.Package{
PName: "libzip",
PVersion: "1.9.2-1",
PDB: mock.NewDB("extra"),
PReason: alpm.PkgReasonDepend,
}
case "gourou":
return &mock.Package{
PName: "gourou",
PVersion: "0.8.1",
PDB: mock.NewDB("aur"),
PReason: alpm.PkgReasonDepend,
}
}
return nil
},
}
mockAUR := &mockaur.MockAUR{GetFn: func(ctx context.Context, query *aurc.Query) ([]aur.Pkg, error) {
mockPkgs := map[string]aur.Pkg{
"gourou": {
Name: "gourou",
PackageBase: "gourou",
Version: "0.8.1",
Depends: []string{"libzip"},
},
}
pkgs := []aur.Pkg{}
for _, needle := range query.Needles {
if pkg, ok := mockPkgs[needle]; ok {
pkgs = append(pkgs, pkg)
} else {
panic(fmt.Sprintf("implement me %v", needle))
}
}
return pkgs, nil
}}
installInfos := map[string]*InstallInfo{
"gourou dep": {
Source: AUR,
Reason: Dep,
Version: "0.8.1",
AURBase: ptrString("gourou"),
},
"libzip dep": {
Source: Sync,
Reason: Dep,
Version: "1.9.2-1",
SyncDBName: ptrString("extra"),
},
}
tests := []struct {
name string
targets []string
wantLayers []map[string]*InstallInfo
wantErr bool
}{
{
name: "gourou libzip",
targets: []string{"gourou", "libzip"},
wantLayers: []map[string]*InstallInfo{
{"gourou": installInfos["gourou dep"]},
{"libzip": installInfos["libzip dep"]},
},
wantErr: false,
},
{
name: "aur/gourou extra/libzip",
targets: []string{"aur/gourou", "extra/libzip"},
wantLayers: []map[string]*InstallInfo{
{"gourou": installInfos["gourou dep"]},
{"libzip": installInfos["libzip dep"]},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
g := NewGrapher(mockDB, mockAUR,
false, true, false, false, false,
text.NewLogger(io.Discard, io.Discard, &os.File{}, true, "test"))
got, err := g.GraphFromTargets(context.Background(), nil, tt.targets)
require.NoError(t, err)
layers := got.TopoSortedLayerMap(nil)
require.EqualValues(t, tt.wantLayers, layers, layers)
})
}
}

pkg/dep/mock/aur.go (new file, 21 lines)

@ -0,0 +1,21 @@
package mock
import (
"context"
"github.com/Jguer/aur"
)
type GetFunc func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error)
type MockAUR struct {
GetFn GetFunc
}
func (m *MockAUR) Get(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
if m.GetFn != nil {
return m.GetFn(ctx, query)
}
panic("implement me")
}
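Usage sketch (mirroring the tests above, with the same aurc alias for github.com/Jguer/aur; the canned package values are made up): MockAUR stands in for the real AUR query client by delegating Get to whatever GetFn the test supplies.

mockAUR := &mockaur.MockAUR{
    GetFn: func(ctx context.Context, query *aurc.Query) ([]aurc.Pkg, error) {
        return []aurc.Pkg{{Name: "yay-bin", Version: "12.0.0-1"}}, nil // canned result
    },
}
pkgs, err := mockAUR.Get(context.Background(), &aurc.Query{By: aurc.Name, Needles: []string{"yay-bin"}})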

pkg/dep/target_handler.go (new file, 34 lines)

@ -0,0 +1,34 @@
package dep
import "github.com/Jguer/yay/v12/pkg/text"
type Target struct {
DB string
Name string
Mod string
Version string
}
func ToTarget(pkg string) Target {
dbName, depString := text.SplitDBFromName(pkg)
name, mod, depVersion := splitDep(depString)
return Target{
DB: dbName,
Name: name,
Mod: mod,
Version: depVersion,
}
}
func (t Target) DepString() string {
return t.Name + t.Mod + t.Version
}
func (t Target) String() string {
if t.DB != "" {
return t.DB + "/" + t.DepString()
}
return t.DepString()
}
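Illustrative fragment (assuming splitDep splits on the comparison operator, as its uses elsewhere in this package suggest): ToTarget peels off the optional repository prefix and the version constraint, and String reassembles the original form.

t := ToTarget("extra/libzip>=1.9.2-1")
// t.DB == "extra", t.Name == "libzip", t.Mod == ">=", t.Version == "1.9.2-1"
_ = t.DepString() // "libzip>=1.9.2-1"
_ = t.String()    // "extra/libzip>=1.9.2-1"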

pkg/dep/testdata/android-sdk.json (vendored, new file, 3 lines)

@ -0,0 +1,3 @@
[
{"ID":1055234,"Name":"android-sdk","PackageBaseID":13751,"PackageBase":"android-sdk","Version":"26.1.1-2","Description":"Google Android SDK","URL":"https://developer.android.com/studio/releases/sdk-tools.html","NumVotes":1487,"Popularity":0.802316,"OutOfDate":null,"Maintainer":"dreamingincode","Submitter":null,"FirstSubmitted":1194895596,"LastModified":1647982720,"URLPath":"/cgit/aur.git/snapshot/android-sdk.tar.gz","Depends":["java-environment","libxtst","fontconfig","freetype2","lib32-gcc-libs","lib32-glibc","libx11","libxext","libxrender","zlib","gcc-libs"],"OptDepends":["android-emulator","android-sdk-platform-tools","android-udev"],"License":["custom"],"Keywords":["android","development"]}
]

pkg/dep/testdata/jellyfin-server.json (vendored, new file, 3 lines)

@ -0,0 +1,3 @@
[
{"ID":1176791,"Name":"jellyfin-server","PackageBaseID":138631,"PackageBase":"jellyfin","Version":"10.8.8-1","Description":"Jellyfin server component","URL":"https://github.com/jellyfin/jellyfin","NumVotes":84,"Popularity":1.272964,"OutOfDate":null,"Maintainer":"z3ntu","Submitter":"z3ntu","FirstSubmitted":1547053171,"LastModified":1669830147,"URLPath":"/cgit/aur.git/snapshot/jellyfin-server.tar.gz","Depends":["dotnet-runtime-6.0","aspnet-runtime-6.0","ffmpeg","sqlite"],"MakeDepends":["dotnet-sdk-6.0","nodejs","npm","git"],"License":["GPL2"]}
]

pkg/dep/testdata/jellyfin-web.json (vendored, new file, 3 lines)

@ -0,0 +1,3 @@
[
{"ID":1176790,"Name":"jellyfin-web","PackageBaseID":138631,"PackageBase":"jellyfin","Version":"10.8.8-1","Description":"Jellyfin web client","URL":"https://github.com/jellyfin/jellyfin","NumVotes":84,"Popularity":1.272964,"OutOfDate":null,"Maintainer":"z3ntu","Submitter":"z3ntu","FirstSubmitted":1547053171,"LastModified":1669830147,"URLPath":"/cgit/aur.git/snapshot/jellyfin-web.tar.gz","MakeDepends":["dotnet-sdk-6.0","nodejs","npm","git"],"License":["GPL2"]}
]

pkg/dep/testdata/jellyfin.json (vendored, new file, 3 lines)

@ -0,0 +1,3 @@
[
{"ID":1176789,"Name":"jellyfin","PackageBaseID":138631,"PackageBase":"jellyfin","Version":"10.8.8-1","Description":"The Free Software Media System","URL":"https://github.com/jellyfin/jellyfin","NumVotes":84,"Popularity":1.272964,"OutOfDate":null,"Maintainer":"z3ntu","Submitter":"z3ntu","FirstSubmitted":1547053171,"LastModified":1669830147,"URLPath":"/cgit/aur.git/snapshot/jellyfin.tar.gz","Depends":["jellyfin-web=10.8.8","jellyfin-server=10.8.8"],"MakeDepends":["dotnet-sdk-6.0","nodejs","npm","git"],"License":["GPL2"]}
]

pkg/dep/topo/dep.go (new file, 371 lines)

@ -0,0 +1,371 @@
package topo
import (
"fmt"
"strings"
"github.com/Jguer/go-alpm/v2"
)
type (
NodeSet[T comparable] map[T]bool
ProvidesMap[T comparable] map[T]*DependencyInfo[T]
DepMap[T comparable] map[T]NodeSet[T]
)
func (n NodeSet[T]) Slice() []T {
var slice []T
for node := range n {
slice = append(slice, node)
}
return slice
}
type NodeInfo[V any] struct {
Color string
Background string
Value V
}
type DependencyInfo[T comparable] struct {
Provider T
alpm.Depend
}
type CheckFn[T comparable, V any] func(T, V) error
type Graph[T comparable, V any] struct {
nodes NodeSet[T]
// node info map
nodeInfo map[T]*NodeInfo[V]
// `provides` tracks provides -> node.
provides ProvidesMap[T]
// `dependencies` tracks child -> parents.
dependencies DepMap[T]
// `dependents` tracks parent -> children.
dependents DepMap[T]
}
func New[T comparable, V any]() *Graph[T, V] {
return &Graph[T, V]{
nodes: make(NodeSet[T]),
dependencies: make(DepMap[T]),
dependents: make(DepMap[T]),
nodeInfo: make(map[T]*NodeInfo[V]),
provides: make(ProvidesMap[T]),
}
}
func (g *Graph[T, V]) Len() int {
return len(g.nodes)
}
func (g *Graph[T, V]) Exists(node T) bool {
_, ok := g.nodes[node]
return ok
}
func (g *Graph[T, V]) AddNode(node T) {
g.nodes[node] = true
}
func (g *Graph[T, V]) ProvidesExists(provides T) bool {
_, ok := g.provides[provides]
return ok
}
func (g *Graph[T, V]) GetProviderNode(provides T) *DependencyInfo[T] {
return g.provides[provides]
}
func (g *Graph[T, V]) Provides(provides T, depInfo *alpm.Depend, node T) {
g.provides[provides] = &DependencyInfo[T]{
Provider: node,
Depend: *depInfo,
}
}
func (g *Graph[T, V]) ForEach(f CheckFn[T, V]) error {
for node := range g.nodes {
if err := f(node, g.nodeInfo[node].Value); err != nil {
return err
}
}
return nil
}
func (g *Graph[T, V]) SetNodeInfo(node T, nodeInfo *NodeInfo[V]) {
g.nodeInfo[node] = nodeInfo
}
func (g *Graph[T, V]) GetNodeInfo(node T) *NodeInfo[V] {
return g.nodeInfo[node]
}
func (g *Graph[T, V]) DependOn(child, parent T) error {
if child == parent {
return ErrSelfReferential
}
if g.DependsOn(parent, child) {
return ErrCircular
}
g.AddNode(parent)
g.AddNode(child)
// Add edges.
g.dependents.addNodeToNodeset(parent, child)
g.dependencies.addNodeToNodeset(child, parent)
return nil
}
func (g *Graph[T, V]) String() string {
var sb strings.Builder
sb.WriteString("digraph {\n")
sb.WriteString("compound=true;\n")
sb.WriteString("concentrate=true;\n")
sb.WriteString("node [shape = record, ordering=out];\n")
for node := range g.nodes {
extra := ""
if info, ok := g.nodeInfo[node]; ok {
if info.Background != "" || info.Color != "" {
extra = fmt.Sprintf("[color = %s, style = filled, fillcolor = %s]", info.Color, info.Background)
}
}
sb.WriteString(fmt.Sprintf("\t\"%v\"%s;\n", node, extra))
}
for parent, children := range g.dependencies {
for child := range children {
sb.WriteString(fmt.Sprintf("\t\"%v\" -> \"%v\";\n", parent, child))
}
}
sb.WriteString("}")
return sb.String()
}
func (g *Graph[T, V]) DependsOn(child, parent T) bool {
deps := g.Dependencies(child)
_, ok := deps[parent]
return ok
}
func (g *Graph[T, V]) HasDependent(parent, child T) bool {
deps := g.Dependents(parent)
_, ok := deps[child]
return ok
}
// leavesMap returns the current leaves (nodes with no remaining dependencies), keyed by node, with each node's info value.
func (g *Graph[T, V]) leavesMap() map[T]V {
leaves := make(map[T]V, 0)
for node := range g.nodes {
if _, ok := g.dependencies[node]; !ok {
nodeInfo := g.GetNodeInfo(node)
if nodeInfo == nil {
nodeInfo = &NodeInfo[V]{}
}
leaves[node] = nodeInfo.Value
}
}
return leaves
}
// TopoSortedLayerMap returns a slice of all of the graph nodes in topological sort order with their node info.
func (g *Graph[T, V]) TopoSortedLayerMap(checkFn CheckFn[T, V]) []map[T]V {
layers := []map[T]V{}
// Copy the graph
shrinkingGraph := g.clone()
for {
leaves := shrinkingGraph.leavesMap()
if len(leaves) == 0 {
break
}
layers = append(layers, leaves)
for leafNode := range leaves {
if checkFn != nil {
if err := checkFn(leafNode, leaves[leafNode]); err != nil {
return nil
}
}
shrinkingGraph.remove(leafNode)
}
}
return layers
}
// removeFromDepmap removes node from the nodeset stored at key and reports whether it was the last entry, in which case the key itself is deleted.
func (dm DepMap[T]) removeFromDepmap(key, node T) bool {
if nodes := dm[key]; len(nodes) == 1 {
// The only element in the nodeset must be `node`, so we
// can delete the entry entirely.
delete(dm, key)
return true
} else {
// Otherwise, remove the single node from the nodeset.
delete(nodes, node)
return false
}
}
// Prune removes the node, then recursively prunes any dependents left without
// dependencies and any dependencies left without dependents.
// It returns every node that was removed.
func (g *Graph[T, V]) Prune(node T) []T {
pruned := []T{node}
// Remove edges from things that depend on `node`.
for dependent := range g.dependents[node] {
last := g.dependencies.removeFromDepmap(dependent, node)
if last {
pruned = append(pruned, g.Prune(dependent)...)
}
}
delete(g.dependents, node)
// Remove all edges from node to the things it depends on.
for dependency := range g.dependencies[node] {
last := g.dependents.removeFromDepmap(dependency, node)
if last {
pruned = append(pruned, g.Prune(dependency)...)
}
}
delete(g.dependencies, node)
// Finally, remove the node itself.
delete(g.nodes, node)
return pruned
}
func (g *Graph[T, V]) remove(node T) {
// Remove edges from things that depend on `node`.
for dependent := range g.dependents[node] {
g.dependencies.removeFromDepmap(dependent, node)
}
delete(g.dependents, node)
// Remove all edges from node to the things it depends on.
for dependency := range g.dependencies[node] {
g.dependents.removeFromDepmap(dependency, node)
}
delete(g.dependencies, node)
// Finally, remove the node itself.
delete(g.nodes, node)
}
func (g *Graph[T, V]) Dependencies(child T) NodeSet[T] {
return g.buildTransitive(child, g.ImmediateDependencies)
}
func (g *Graph[T, V]) ImmediateDependencies(node T) NodeSet[T] {
return g.dependencies[node]
}
func (g *Graph[T, V]) Dependents(parent T) NodeSet[T] {
return g.buildTransitive(parent, g.immediateDependents)
}
func (g *Graph[T, V]) immediateDependents(node T) NodeSet[T] {
return g.dependents[node]
}
func (g *Graph[T, V]) clone() *Graph[T, V] {
return &Graph[T, V]{
dependencies: g.dependencies.copy(),
dependents: g.dependents.copy(),
nodes: g.nodes.copy(),
nodeInfo: g.nodeInfo, // not copied, as it is not modified
}
}
// buildTransitive starts at `root` and continues calling `nextFn` to keep discovering more nodes until
// the graph cannot produce any more. It returns the set of all discovered nodes.
func (g *Graph[T, V]) buildTransitive(root T, nextFn func(T) NodeSet[T]) NodeSet[T] {
if _, ok := g.nodes[root]; !ok {
return nil
}
out := make(NodeSet[T])
searchNext := []T{root}
for len(searchNext) > 0 {
// List of new nodes from this layer of the dependency graph. This is
// assigned to `searchNext` at the end of the outer "discovery" loop.
discovered := []T{}
for _, node := range searchNext {
// For each node to discover, find the next nodes.
for nextNode := range nextFn(node) {
// If we have not seen the node before, add it to the output as well
// as the list of nodes to traverse in the next iteration.
if _, ok := out[nextNode]; !ok {
out[nextNode] = true
discovered = append(discovered, nextNode)
}
}
}
searchNext = discovered
}
return out
}
func (s NodeSet[T]) copy() NodeSet[T] {
out := make(NodeSet[T], len(s))
for k, v := range s {
out[k] = v
}
return out
}
func (dm DepMap[T]) copy() DepMap[T] {
out := make(DepMap[T], len(dm))
for k := range dm {
out[k] = dm[k].copy()
}
return out
}
func (dm DepMap[T]) addNodeToNodeset(key, node T) {
nodes, ok := dm[key]
if !ok {
nodes = make(NodeSet[T])
dm[key] = nodes
}
nodes[node] = true
}
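To make the edge direction concrete, a small hypothetical example of this API (node values left empty by using any): the grapher calls DependOn(dependency, dependent), so TopoSortedLayerMap emits dependents first and their dependencies in later layers, matching the layer maps asserted in the tests above.

g := New[string, any]()
_ = g.DependOn("jellyfin-server", "jellyfin")           // jellyfin needs jellyfin-server
_ = g.DependOn("dotnet-runtime-6.0", "jellyfin-server") // which in turn needs dotnet-runtime-6.0
layers := g.TopoSortedLayerMap(nil)
// layers[0] == {"jellyfin": nil}, layers[1] == {"jellyfin-server": nil}, layers[2] == {"dotnet-runtime-6.0": nil}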

pkg/dep/topo/errors.go (new file, 9 lines)

@ -0,0 +1,9 @@
package topo
import "errors"
var (
ErrSelfReferential = errors.New(" self-referential dependencies not allowed")
ErrConflictingAlias = errors.New(" alias already defined")
ErrCircular = errors.New(" circular dependencies not allowed")
)
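Illustrative only: DependOn returns these sentinel errors directly, so callers can test for them with errors.Is; in the grapher such failures are merely logged as warnings.

// g is a *topo.Graph[string, *InstallInfo], as used by the grapher above.
if err := g.DependOn("foo", "foo"); errors.Is(err, topo.ErrSelfReferential) {
    // a package cannot depend on itself; the grapher just warns and moves on
}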


@ -6,63 +6,61 @@ import (
"fmt"
"io"
"net/http"
"regexp"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/exe"
)
const (
MaxConcurrentFetch = 20
_urlPackagePath = "%s/raw/packages/%s/trunk/PKGBUILD"
absPackageURL = "https://gitlab.archlinux.org/archlinux/packaging/packages"
)
var (
ErrInvalidRepository = errors.New(gotext.Get("invalid repository"))
ErrABSPackageNotFound = errors.New(gotext.Get("package not found in repos"))
ABSPackageURL = "https://github.com/archlinux/svntogit-packages"
ABSCommunityURL = "https://github.com/archlinux/svntogit-community"
)
func getRepoURL(db string) (string, error) {
switch db {
case "core", "extra", "testing":
return ABSPackageURL, nil
case "community", "multilib", "community-testing", "multilib-testing":
return ABSCommunityURL, nil
}
type regexReplace struct {
repl string
match *regexp.Regexp
}
return "", ErrInvalidRepository
// regex replacements for Gitlab URLs
// info: https://gitlab.archlinux.org/archlinux/devtools/-/blob/6ce666a1669235749c17d5c44d8a24dea4a135da/src/lib/api/gitlab.sh#L84
var gitlabRepl = []regexReplace{
{repl: `$1-$2`, match: regexp.MustCompile(`([a-zA-Z0-9]+)\+([a-zA-Z]+)`)},
{repl: `plus`, match: regexp.MustCompile(`\+`)},
{repl: `-`, match: regexp.MustCompile(`[^a-zA-Z0-9_\-.]`)},
{repl: `-`, match: regexp.MustCompile(`[_\-]{2,}`)},
{repl: `unix-tree`, match: regexp.MustCompile(`^tree$`)},
}
// Return format for pkgbuild
// https://github.com/archlinux/svntogit-community/raw/packages/neovim/trunk/PKGBUILD
func getPackageURL(db, pkgName string) (string, error) {
repoURL, err := getRepoURL(db)
if err != nil {
return "", err
}
return fmt.Sprintf(_urlPackagePath, repoURL, pkgName), err
// https://gitlab.archlinux.org/archlinux/packaging/packages/0ad/-/raw/main/PKGBUILD
func getPackagePKGBUILDURL(pkgName string) string {
return fmt.Sprintf("%s/%s/-/raw/main/PKGBUILD", absPackageURL, convertPkgNameForURL(pkgName))
}
// Return format for pkgbuild repo
// https://github.com/archlinux/svntogit-community.git
func getPackageRepoURL(db string) (string, error) {
repoURL, err := getRepoURL(db)
if err != nil {
return "", err
}
// https://gitlab.archlinux.org/archlinux/packaging/packages/0ad.git
func getPackageRepoURL(pkgName string) string {
return fmt.Sprintf("%s/%s.git", absPackageURL, convertPkgNameForURL(pkgName))
}
return repoURL + ".git", err
// convert pkgName for Gitlab URL path (repo name)
func convertPkgNameForURL(pkgName string) string {
for _, regex := range gitlabRepl {
pkgName = regex.match.ReplaceAllString(pkgName, regex.repl)
}
return pkgName
}
// ABSPKGBUILD retrieves the PKGBUILD file to a dest directory.
func ABSPKGBUILD(httpClient httpRequestDoer, dbName, pkgName string) ([]byte, error) {
packageURL, err := getPackageURL(dbName, pkgName)
if err != nil {
return nil, err
}
packageURL := getPackagePKGBUILDURL(pkgName)
resp, err := httpClient.Get(packageURL)
if err != nil {
@ -85,12 +83,10 @@ func ABSPKGBUILD(httpClient httpRequestDoer, dbName, pkgName string) ([]byte, er
// ABSPKGBUILDRepo retrieves the PKGBUILD repository to a dest directory.
func ABSPKGBUILDRepo(ctx context.Context, cmdBuilder exe.GitCmdBuilder,
dbName, pkgName, dest string, force bool) (bool, error) {
pkgURL, err := getPackageRepoURL(dbName)
if err != nil {
return false, err
}
dbName, pkgName, dest string, force bool,
) (bool, error) {
pkgURL := getPackageRepoURL(pkgName)
return downloadGitRepo(ctx, cmdBuilder, pkgURL,
pkgName, dest, force, "--single-branch", "-b", "packages/"+pkgName)
pkgName, dest, force, "--single-branch")
}

View File

@ -1,3 +1,6 @@
//go:build !integration
// +build !integration
package download
import (
@ -10,7 +13,7 @@ import (
"github.com/stretchr/testify/assert"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/exe"
)
const gitExtrasPKGBUILD = `pkgname=git-extras
@ -47,12 +50,12 @@ func Test_getPackageURL(t *testing.T) {
wantErr bool
}{
{
name: "community package",
name: "extra package",
args: args{
db: "community",
db: "extra",
pkgName: "kitty",
},
want: "https://github.com/archlinux/svntogit-community/raw/packages/kitty/trunk/PKGBUILD",
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/kitty/-/raw/main/PKGBUILD",
wantErr: false,
},
{
@ -61,27 +64,69 @@ func Test_getPackageURL(t *testing.T) {
db: "core",
pkgName: "linux",
},
want: "https://github.com/archlinux/svntogit-packages/raw/packages/linux/trunk/PKGBUILD",
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/linux/-/raw/main/PKGBUILD",
wantErr: false,
},
{
name: "personal repo package",
args: args{
db: "sweswe",
pkgName: "linux",
pkgName: "zabix",
},
want: "",
wantErr: true,
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/zabix/-/raw/main/PKGBUILD",
wantErr: false,
},
{
name: "special name +",
args: args{
db: "core",
pkgName: "my+package",
},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/my-package/-/raw/main/PKGBUILD",
wantErr: false,
},
{
name: "special name %",
args: args{
db: "core",
pkgName: "my%package",
},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/my-package/-/raw/main/PKGBUILD",
wantErr: false,
},
{
name: "special name _-",
args: args{
db: "core",
pkgName: "my_-package",
},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/my-package/-/raw/main/PKGBUILD",
wantErr: false,
},
{
name: "special name ++",
args: args{
db: "core",
pkgName: "my++package",
},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/mypluspluspackage/-/raw/main/PKGBUILD",
wantErr: false,
},
{
name: "special name tree",
args: args{
db: "sweswe",
pkgName: "tree",
},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/unix-tree/-/raw/main/PKGBUILD",
wantErr: false,
},
}
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
got, err := getPackageURL(tt.args.db, tt.args.pkgName)
if tt.wantErr {
assert.ErrorIs(t, err, ErrInvalidRepository)
}
got := getPackagePKGBUILDURL(tt.args.pkgName)
assert.Equal(t, tt.want, got)
})
}
@ -110,7 +155,7 @@ func TestGetABSPkgbuild(t *testing.T) {
body: gitExtrasPKGBUILD,
status: 200,
pkgName: "git-extras",
wantURL: "https://github.com/archlinux/svntogit-packages/raw/packages/git-extras/trunk/PKGBUILD",
wantURL: "https://gitlab.archlinux.org/archlinux/packaging/packages/git-extras/-/raw/main/PKGBUILD",
},
want: gitExtrasPKGBUILD,
wantErr: false,
@ -122,7 +167,7 @@ func TestGetABSPkgbuild(t *testing.T) {
body: "",
status: 404,
pkgName: "git-git",
wantURL: "https://github.com/archlinux/svntogit-packages/raw/packages/git-git/trunk/PKGBUILD",
wantURL: "https://gitlab.archlinux.org/archlinux/packaging/packages/git-git/-/raw/main/PKGBUILD",
},
want: "",
wantErr: true,
@ -154,7 +199,7 @@ func Test_getPackageRepoURL(t *testing.T) {
t.Parallel()
type args struct {
db string
pkgName string
}
tests := []struct {
name string
@ -163,32 +208,59 @@ func Test_getPackageRepoURL(t *testing.T) {
wantErr bool
}{
{
name: "community package",
args: args{db: "community"},
want: "https://github.com/archlinux/svntogit-community.git",
name: "extra package",
args: args{pkgName: "zoxide"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/zoxide.git",
wantErr: false,
},
{
name: "core package",
args: args{db: "core"},
want: "https://github.com/archlinux/svntogit-packages.git",
args: args{pkgName: "linux"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/linux.git",
wantErr: false,
},
{
name: "personal repo package",
args: args{db: "sweswe"},
want: "",
wantErr: true,
args: args{pkgName: "sweswe"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/sweswe.git",
wantErr: false,
},
{
name: "special name +",
args: args{pkgName: "my+package"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/my-package.git",
wantErr: false,
},
{
name: "special name %",
args: args{pkgName: "my%package"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/my-package.git",
wantErr: false,
},
{
name: "special name _-",
args: args{pkgName: "my_-package"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/my-package.git",
wantErr: false,
},
{
name: "special name ++",
args: args{pkgName: "my++package"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/mypluspluspackage.git",
wantErr: false,
},
{
name: "special name tree",
args: args{pkgName: "tree"},
want: "https://gitlab.archlinux.org/archlinux/packaging/packages/unix-tree.git",
wantErr: false,
},
}
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
got, err := getPackageRepoURL(tt.args.db)
if tt.wantErr {
assert.ErrorIs(t, err, ErrInvalidRepository)
}
got := getPackageRepoURL(tt.args.pkgName)
assert.Equal(t, tt.want, got)
})
}
@ -200,13 +272,13 @@ func Test_getPackageRepoURL(t *testing.T) {
func TestABSPKGBUILDRepo(t *testing.T) {
t.Parallel()
cmdRunner := &testRunner{}
want := "/usr/local/bin/git --no-replace-objects -C /tmp/doesnt-exist clone --no-progress --single-branch -b packages/linux https://github.com/archlinux/svntogit-packages.git linux"
want := "/usr/local/bin/git --no-replace-objects -C /tmp/doesnt-exist clone --no-progress --single-branch https://gitlab.archlinux.org/archlinux/packaging/packages/linux.git linux"
if os.Getuid() == 0 {
ld := "systemd-run"
if path, _ := exec.LookPath(ld); path != "" {
ld = path
}
want = fmt.Sprintf("%s --service-type=oneshot --pipe --wait --pty --quiet -p DynamicUser=yes -p CacheDirectory=yay -E HOME=/tmp --no-replace-objects -C /tmp/doesnt-exist clone --no-progress --single-branch -b packages/linux https://github.com/archlinux/svntogit-packages.git linux", ld)
want = fmt.Sprintf("%s --service-type=oneshot --pipe --wait --pty --quiet -p DynamicUser=yes -p CacheDirectory=yay -E HOME=/tmp --no-replace-objects -C /tmp/doesnt-exist clone --no-progress --single-branch https://gitlab.archlinux.org/archlinux/packaging/packages/linux.git linux", ld)
}
cmdBuilder := &testGitBuilder{
@ -219,7 +291,7 @@ func TestABSPKGBUILDRepo(t *testing.T) {
GitFlags: []string{"--no-replace-objects"},
},
}
newClone, err := ABSPKGBUILDRepo(context.TODO(), cmdBuilder, "core", "linux", "/tmp/doesnt-exist", false)
newClone, err := ABSPKGBUILDRepo(context.Background(), cmdBuilder, "core", "linux", "/tmp/doesnt-exist", false)
assert.NoError(t, err)
assert.Equal(t, true, newClone)
}
@ -253,7 +325,7 @@ func TestABSPKGBUILDRepoExistsPerms(t *testing.T) {
GitFlags: []string{"--no-replace-objects"},
},
}
newClone, err := ABSPKGBUILDRepo(context.TODO(), cmdBuilder, "core", "linux", dir, false)
newClone, err := ABSPKGBUILDRepo(context.Background(), cmdBuilder, "core", "linux", dir, false)
assert.NoError(t, err)
assert.Equal(t, false, newClone)
}


@ -10,9 +10,9 @@ import (
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/multierror"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/multierror"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/text"
)
func AURPKGBUILD(httpClient httpRequestDoer, pkgName, aurURL string) ([]byte, error) {
@ -48,8 +48,9 @@ func AURPKGBUILDRepo(ctx context.Context, cmdBuilder exe.GitCmdBuilder, aurURL,
func AURPKGBUILDRepos(
ctx context.Context,
cmdBuilder exe.GitCmdBuilder,
targets []string, aurURL, dest string, force bool) (map[string]bool, error) {
cmdBuilder exe.GitCmdBuilder, logger *text.Logger,
targets []string, aurURL, dest string, force bool,
) (map[string]bool, error) {
cloned := make(map[string]bool, len(targets))
var (
@ -62,30 +63,34 @@ func AURPKGBUILDRepos(
for _, target := range targets {
sem <- 1
wg.Add(1)
go func(target string) {
defer func() {
<-sem
wg.Done()
}()
newClone, err := AURPKGBUILDRepo(ctx, cmdBuilder, aurURL, target, dest, force)
progress := 0
mux.Lock()
progress := len(cloned)
if err != nil {
errs.Add(err)
} else {
mux.Lock()
cloned[target] = newClone
progress = len(cloned)
mux.Unlock()
logger.OperationInfoln(
gotext.Get("(%d/%d) Failed to download PKGBUILD: %s",
progress, len(targets), text.Cyan(target)))
return
}
text.OperationInfoln(
cloned[target] = newClone
progress = len(cloned)
mux.Unlock()
logger.OperationInfoln(
gotext.Get("(%d/%d) Downloaded PKGBUILD: %s",
progress, len(targets), text.Cyan(target)))
<-sem
wg.Done()
}(target)
}


@ -1,3 +1,6 @@
//go:build !integration
// +build !integration
package download
import (
@ -10,7 +13,7 @@ import (
"github.com/stretchr/testify/assert"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/exe"
)
func TestGetAURPkgbuild(t *testing.T) {
@ -98,7 +101,7 @@ func TestAURPKGBUILDRepo(t *testing.T) {
GitFlags: []string{"--no-replace-objects"},
},
}
newCloned, err := AURPKGBUILDRepo(context.TODO(), cmdBuilder, "https://aur.archlinux.org", "yay-bin", "/tmp/doesnt-exist", false)
newCloned, err := AURPKGBUILDRepo(context.Background(), cmdBuilder, "https://aur.archlinux.org", "yay-bin", "/tmp/doesnt-exist", false)
assert.NoError(t, err)
assert.Equal(t, true, newCloned)
}
@ -132,7 +135,7 @@ func TestAURPKGBUILDRepoExistsPerms(t *testing.T) {
GitFlags: []string{"--no-replace-objects"},
},
}
cloned, err := AURPKGBUILDRepo(context.TODO(), cmdBuilder, "https://aur.archlinux.org", "yay-bin", dir, false)
cloned, err := AURPKGBUILDRepo(context.Background(), cmdBuilder, "https://aur.archlinux.org", "yay-bin", dir, false)
assert.NoError(t, err)
assert.Equal(t, false, cloned)
}
@ -155,7 +158,7 @@ func TestAURPKGBUILDRepos(t *testing.T) {
GitFlags: []string{},
},
}
cloned, err := AURPKGBUILDRepos(context.TODO(), cmdBuilder, targets, "https://aur.archlinux.org", dir, false)
cloned, err := AURPKGBUILDRepos(context.Background(), cmdBuilder, newTestLogger(), targets, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
assert.EqualValues(t, map[string]bool{"yay": true, "yay-bin": false, "yay-git": true}, cloned)


@ -9,11 +9,13 @@ import (
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/multierror"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/aur"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/multierror"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
type httpRequestDoer interface {
@ -22,11 +24,12 @@ type httpRequestDoer interface {
type DBSearcher interface {
SyncPackage(string) db.IPackage
SatisfierFromDB(string, string) db.IPackage
SyncPackageFromDB(string, string) db.IPackage
}
func downloadGitRepo(ctx context.Context, cmdBuilder exe.GitCmdBuilder,
pkgURL, pkgName, dest string, force bool, gitArgs ...string) (bool, error) {
pkgURL, pkgName, dest string, force bool, gitArgs ...string,
) (bool, error) {
finalDir := filepath.Join(dest, pkgName)
newClone := true
@ -78,8 +81,9 @@ func getURLName(pkg db.IPackage) string {
return name
}
func PKGBUILDs(dbExecutor DBSearcher, httpClient *http.Client, targets []string,
aurURL string, mode parser.TargetMode) (map[string][]byte, error) {
func PKGBUILDs(dbExecutor DBSearcher, aurClient aur.QueryClient, httpClient *http.Client,
logger *text.Logger, targets []string, aurURL string, mode parser.TargetMode,
) (map[string][]byte, error) {
pkgbuilds := make(map[string][]byte, len(targets))
var (
@ -92,7 +96,7 @@ func PKGBUILDs(dbExecutor DBSearcher, httpClient *http.Client, targets []string,
for _, target := range targets {
// Probably replaceable by something in query.
dbName, name, aur, toSkip := getPackageUsableName(dbExecutor, target, mode)
dbName, name, isAUR, toSkip := getPackageUsableName(dbExecutor, aurClient, logger, target, mode)
if toSkip {
continue
}
@ -123,7 +127,7 @@ func PKGBUILDs(dbExecutor DBSearcher, httpClient *http.Client, targets []string,
<-sem
wg.Done()
}(target, dbName, name, aur)
}(target, dbName, name, isAUR)
}
wg.Wait()
@ -131,9 +135,10 @@ func PKGBUILDs(dbExecutor DBSearcher, httpClient *http.Client, targets []string,
return pkgbuilds, errs.Return()
}
func PKGBUILDRepos(ctx context.Context, dbExecutor DBSearcher,
cmdBuilder exe.GitCmdBuilder,
targets []string, mode parser.TargetMode, aurURL, dest string, force bool) (map[string]bool, error) {
func PKGBUILDRepos(ctx context.Context, dbExecutor DBSearcher, aurClient aur.QueryClient,
cmdBuilder exe.GitCmdBuilder, logger *text.Logger,
targets []string, mode parser.TargetMode, aurURL, dest string, force bool,
) (map[string]bool, error) {
cloned := make(map[string]bool, len(targets))
var (
@ -146,7 +151,7 @@ func PKGBUILDRepos(ctx context.Context, dbExecutor DBSearcher,
for _, target := range targets {
// Probably replaceable by something in query.
dbName, name, aur, toSkip := getPackageUsableName(dbExecutor, target, mode)
dbName, name, isAUR, toSkip := getPackageUsableName(dbExecutor, aurClient, logger, target, mode)
if toSkip {
continue
}
@ -179,11 +184,11 @@ func PKGBUILDRepos(ctx context.Context, dbExecutor DBSearcher,
}
if aur {
text.OperationInfoln(
logger.OperationInfoln(
gotext.Get("(%d/%d) Downloaded PKGBUILD: %s",
progress, len(targets), text.Cyan(pkgName)))
} else {
text.OperationInfoln(
logger.OperationInfoln(
gotext.Get("(%d/%d) Downloaded PKGBUILD from ABS: %s",
progress, len(targets), text.Cyan(pkgName)))
}
@ -191,7 +196,7 @@ func PKGBUILDRepos(ctx context.Context, dbExecutor DBSearcher,
<-sem
wg.Done()
}(target, dbName, name, aur)
}(target, dbName, name, isAUR)
}
wg.Wait()
@ -200,34 +205,47 @@ func PKGBUILDRepos(ctx context.Context, dbExecutor DBSearcher,
}
// TODO: replace with dep.ResolveTargets.
func getPackageUsableName(dbExecutor DBSearcher, target string, mode parser.TargetMode) (dbname, pkgname string, aur, toSkip bool) {
aur = true
func getPackageUsableName(dbExecutor DBSearcher, aurClient aur.QueryClient,
logger *text.Logger, target string, mode parser.TargetMode,
) (dbname, pkgname string, isAUR, toSkip bool) {
dbName, name := text.SplitDBFromName(target)
if dbName != "aur" && mode.AtLeastRepo() {
var pkg db.IPackage
if dbName != "" {
pkg = dbExecutor.SatisfierFromDB(name, dbName)
if pkg == nil {
// if the user specified a db but the package is not in that db
// then it is missing
// Mode does not allow AUR packages
return dbName, name, aur, true
}
pkg = dbExecutor.SyncPackageFromDB(name, dbName)
} else {
pkg = dbExecutor.SyncPackage(name)
}
if pkg != nil {
aur = false
name = getURLName(pkg)
dbName = pkg.DB().Name()
return dbName, name, false, false
}
// If the package was expected in a specific database but was not found there, skip it
if pkg == nil && dbName != "" {
return dbName, name, true, true
}
}
if aur && mode == parser.ModeRepo {
return dbName, name, aur, true
if mode == parser.ModeRepo {
return dbName, name, true, true
}
return dbName, name, aur, false
pkgs, err := aurClient.Get(context.Background(), &aur.Query{
By: aur.Name,
Contains: false,
Needles: []string{name},
})
if err != nil {
logger.Warnln(err)
return dbName, name, true, true
}
if len(pkgs) == 0 {
return dbName, name, true, true
}
return "aur", name, true, false
}


@ -0,0 +1,106 @@
//go:build integration
// +build integration
package download
import (
"context"
"net/http"
"os"
"strings"
"testing"
"github.com/stretchr/testify/assert"
"github.com/Jguer/aur"
mockaur "github.com/Jguer/yay/v12/pkg/dep/mock"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
func TestIntegrationPKGBUILDReposDefinedDBClone(t *testing.T) {
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil // fakes a package found for all
},
}
targets := []string{"core/linux", "yay-bin", "yay-git"}
testLogger := text.NewLogger(os.Stdout, os.Stderr, strings.NewReader(""), true, "test")
cmdRunner := &exe.OSRunner{Log: testLogger}
cmdBuilder := &exe.CmdBuilder{
Runner: cmdRunner,
GitBin: "git",
GitFlags: []string{},
Log: testLogger,
}
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"linux": "core"},
}
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, testLogger.Child("test"),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
assert.EqualValues(t, map[string]bool{"core/linux": true, "yay-bin": true, "yay-git": true}, cloned)
}
func TestIntegrationPKGBUILDReposNotExist(t *testing.T) {
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil // fakes a package found for all
},
}
targets := []string{"core/yay", "yay-bin", "yay-git"}
testLogger := text.NewLogger(os.Stdout, os.Stderr, strings.NewReader(""), true, "test")
cmdRunner := &exe.OSRunner{Log: testLogger}
cmdBuilder := &exe.CmdBuilder{
Runner: cmdRunner,
GitBin: "git",
GitFlags: []string{},
Log: testLogger,
}
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, testLogger.Child("test"),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.Error(t, err)
assert.EqualValues(t, map[string]bool{"yay-bin": true, "yay-git": true}, cloned)
}
// GIVEN 2 aur packages and 1 in repo
// WHEN defining as specified targets
// THEN all of them should be found and cloned
func TestIntegrationPKGBUILDFull(t *testing.T) {
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil
},
}
testLogger := text.NewLogger(os.Stdout, os.Stderr, strings.NewReader(""), true, "test")
targets := []string{"core/linux", "aur/yay-bin", "yay-git"}
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"linux": "core"},
}
fetched, err := PKGBUILDs(searcher, mockClient, &http.Client{}, testLogger.Child("test"),
targets, "https://aur.archlinux.org", parser.ModeAny)
assert.NoError(t, err)
for _, target := range targets {
assert.Contains(t, fetched, target)
assert.NotEmpty(t, fetched[target])
}
}

View File

@ -1,19 +1,32 @@
//go:build !integration
// +build !integration
package download
import (
"context"
"io"
"net/http"
"os"
"path/filepath"
"strings"
"testing"
"github.com/stretchr/testify/assert"
"gopkg.in/h2non/gock.v1"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/aur"
mockaur "github.com/Jguer/yay/v12/pkg/dep/mock"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
func newTestLogger() *text.Logger {
return text.NewLogger(io.Discard, io.Discard, strings.NewReader(""), true, "test")
}
// GIVEN 2 aur packages and 1 in repo
// GIVEN package in repo is already present
// WHEN defining package db as a target
@ -22,6 +35,14 @@ func TestPKGBUILDReposDefinedDBPull(t *testing.T) {
t.Parallel()
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil // fakes a package found for all
},
}
testLogger := text.NewLogger(os.Stdout, os.Stderr, strings.NewReader(""), true, "test")
os.MkdirAll(filepath.Join(dir, "yay", ".git"), 0o777)
targets := []string{"core/yay", "yay-bin", "yay-git"}
@ -33,13 +54,14 @@ func TestPKGBUILDReposDefinedDBPull(t *testing.T) {
Runner: cmdRunner,
GitBin: "/usr/local/bin/git",
GitFlags: []string{},
Log: testLogger,
},
}
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.TODO(), searcher,
cmdBuilder,
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, newTestLogger(),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
@ -53,6 +75,11 @@ func TestPKGBUILDReposDefinedDBClone(t *testing.T) {
t.Parallel()
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil // fakes a package found for all
},
}
targets := []string{"core/yay", "yay-bin", "yay-git"}
cmdRunner := &testRunner{}
cmdBuilder := &testGitBuilder{
@ -67,8 +94,8 @@ func TestPKGBUILDReposDefinedDBClone(t *testing.T) {
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.TODO(), searcher,
cmdBuilder,
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, newTestLogger(),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
@ -82,6 +109,11 @@ func TestPKGBUILDReposClone(t *testing.T) {
t.Parallel()
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil // fakes a package found for all
},
}
targets := []string{"yay", "yay-bin", "yay-git"}
cmdRunner := &testRunner{}
cmdBuilder := &testGitBuilder{
@ -96,8 +128,8 @@ func TestPKGBUILDReposClone(t *testing.T) {
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.TODO(), searcher,
cmdBuilder,
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, newTestLogger(),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
@ -111,6 +143,11 @@ func TestPKGBUILDReposNotFound(t *testing.T) {
t.Parallel()
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil // fakes a package found for all
},
}
targets := []string{"extra/yay", "yay-bin", "yay-git"}
cmdRunner := &testRunner{}
cmdBuilder := &testGitBuilder{
@ -125,8 +162,8 @@ func TestPKGBUILDReposNotFound(t *testing.T) {
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.TODO(), searcher,
cmdBuilder,
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, newTestLogger(),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
@ -140,6 +177,11 @@ func TestPKGBUILDReposRepoMode(t *testing.T) {
t.Parallel()
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{}, nil // fakes no AUR packages found
},
}
targets := []string{"yay", "yay-bin", "yay-git"}
cmdRunner := &testRunner{}
cmdBuilder := &testGitBuilder{
@ -154,8 +196,8 @@ func TestPKGBUILDReposRepoMode(t *testing.T) {
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.TODO(), searcher,
cmdBuilder,
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, newTestLogger(),
targets, parser.ModeRepo, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
@ -168,6 +210,11 @@ func TestPKGBUILDReposRepoMode(t *testing.T) {
func TestPKGBUILDFull(t *testing.T) {
t.Parallel()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{{}}, nil
},
}
gock.New("https://aur.archlinux.org").
Get("/cgit/aur.git/plain/PKGBUILD").MatchParam("h", "yay-git").
Reply(200).
@ -177,8 +224,8 @@ func TestPKGBUILDFull(t *testing.T) {
Reply(200).
BodyString("example_yay-bin")
gock.New("https://github.com/").
Get("/archlinux/svntogit-packages/raw/packages/yay/trunk/PKGBUILD").
gock.New("https://gitlab.archlinux.org/").
Get("archlinux/packaging/packages/yay/-/raw/main/PKGBUILD").
Reply(200).
BodyString("example_yay")
@ -188,7 +235,7 @@ func TestPKGBUILDFull(t *testing.T) {
absPackagesDB: map[string]string{"yay": "core"},
}
fetched, err := PKGBUILDs(searcher, &http.Client{},
fetched, err := PKGBUILDs(searcher, mockClient, &http.Client{}, newTestLogger(),
targets, "https://aur.archlinux.org", parser.ModeAny)
assert.NoError(t, err)
@ -198,3 +245,37 @@ func TestPKGBUILDFull(t *testing.T) {
"yay-git": []byte("example_yay-git"),
}, fetched)
}
// GIVEN 2 aur packages and 1 in repo
// WHEN aur packages are not found
// THEN only the repo package should be cloned
func TestPKGBUILDReposMissingAUR(t *testing.T) {
t.Parallel()
dir := t.TempDir()
mockClient := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{}, nil // fakes no AUR packages found
},
}
targets := []string{"core/yay", "aur/yay-bin", "aur/yay-git"}
cmdRunner := &testRunner{}
cmdBuilder := &testGitBuilder{
index: 0,
test: t,
parentBuilder: &exe.CmdBuilder{
Runner: cmdRunner,
GitBin: "/usr/local/bin/git",
GitFlags: []string{},
},
}
searcher := &testDBSearcher{
absPackagesDB: map[string]string{"yay": "core"},
}
cloned, err := PKGBUILDRepos(context.Background(), searcher, mockClient,
cmdBuilder, newTestLogger(),
targets, parser.ModeAny, "https://aur.archlinux.org", dir, false)
assert.NoError(t, err)
assert.EqualValues(t, map[string]bool{"core/yay": true}, cloned)
}
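The tests above all stub the AUR with a fixed GetFn. A slightly richer stub, purely illustrative and not part of this change set, can answer per needle, which makes mixed found/missing scenarios such as TestPKGBUILDReposMissingAUR easier to express:

// Illustrative only: a MockAUR that reports just the names it knows about,
// so found and missing AUR targets can be mixed in one test.
func newSelectiveMockAUR(known map[string]bool) *mockaur.MockAUR {
	return &mockaur.MockAUR{
		GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
			pkgs := make([]aur.Pkg, 0, len(query.Needles))
			for _, needle := range query.Needles {
				if known[needle] {
					pkgs = append(pkgs, aur.Pkg{Name: needle, PackageBase: needle})
				}
			}
			return pkgs, nil
		},
	}
}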

View File

@ -12,8 +12,8 @@ import (
"github.com/Jguer/go-alpm/v2"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/settings/exe"
)
type testRunner struct{}
@ -102,7 +102,7 @@ func (d *testDBSearcher) SyncPackage(name string) db.IPackage {
return nil
}
func (d *testDBSearcher) SatisfierFromDB(name string, db string) db.IPackage {
func (d *testDBSearcher) SyncPackageFromDB(name string, db string) db.IPackage {
if v, ok := d.absPackagesDB[name]; ok && v == db {
return &testPackage{
name: name,

View File

@ -5,7 +5,7 @@ import (
"strings"
"unicode"
"github.com/Jguer/yay/v11/pkg/stringset"
mapset "github.com/deckarep/golang-set/v2"
)
// IntRange stores a max and min amount for range.
@ -17,10 +17,10 @@ type IntRange struct {
// IntRanges is a slice of IntRange.
type IntRanges []IntRange
func makeIntRange(min, max int) IntRange {
func makeIntRange(minVal, maxVal int) IntRange {
return IntRange{
min,
max,
min: minVal,
max: maxVal,
}
}
@ -42,24 +42,6 @@ func (rs IntRanges) Get(n int) bool {
return false
}
// Min returns min value between a and b.
func Min(a, b int) int {
if a < b {
return a
}
return b
}
// Max returns max value between a and b.
func Max(a, b int) int {
if a < b {
return b
}
return a
}
// ParseNumberMenu parses input for number menus split by spaces or commas
// supports individual selection: 1 2 3 4
// supports range selections: 1-4 10-20
@ -71,11 +53,12 @@ func Max(a, b int) int {
// of course the implementation is up to the caller, this function merely parses
// the input and organizes it.
func ParseNumberMenu(input string) (include, exclude IntRanges,
otherInclude, otherExclude stringset.StringSet) {
otherInclude, otherExclude mapset.Set[string],
) {
include = make(IntRanges, 0)
exclude = make(IntRanges, 0)
otherInclude = make(stringset.StringSet)
otherExclude = make(stringset.StringSet)
otherInclude = mapset.NewThreadUnsafeSet[string]()
otherExclude = mapset.NewThreadUnsafeSet[string]()
words := strings.FieldsFunc(input, func(c rune) bool {
return unicode.IsSpace(c) || c == ','
@ -101,22 +84,22 @@ func ParseNumberMenu(input string) (include, exclude IntRanges,
num1, err = strconv.Atoi(ranges[0])
if err != nil {
other.Set(strings.ToLower(word))
other.Add(strings.ToLower(word))
continue
}
if len(ranges) == 2 {
num2, err = strconv.Atoi(ranges[1])
if err != nil {
other.Set(strings.ToLower(word))
other.Add(strings.ToLower(word))
continue
}
} else {
num2 = num1
}
mi := Min(num1, num2)
ma := Max(num1, num2)
mi := min(num1, num2)
ma := max(num1, num2)
if !invert {
include = append(include, makeIntRange(mi, ma))
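As the doc comment above describes, the parser accepts plain numbers, ranges, and ^-prefixed exclusions, and collects anything non-numeric into the string sets; a minimal usage sketch (assumed call site outside the package):

// Usage sketch for the parser above.
include, exclude, otherInclude, otherExclude := intrange.ParseNumberMenu("1-3 ^2 all")
_ = include.Get(1)               // true: covered by the 1-3 range
_ = exclude.Get(2)               // true: ^2 marks it excluded
_ = otherInclude.Contains("all") // true: non-numeric words are lowercased and collected
_ = otherExclude.Cardinality()   // 0: nothing was excluded by name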

View File

@ -1,9 +1,13 @@
//go:build !integration
// +build !integration
package intrange
import (
"testing"
"github.com/Jguer/yay/v11/pkg/stringset"
mapset "github.com/deckarep/golang-set/v2"
"github.com/stretchr/testify/assert"
)
func TestParseNumberMenu(t *testing.T) {
@ -11,8 +15,8 @@ func TestParseNumberMenu(t *testing.T) {
type result struct {
Include IntRanges
Exclude IntRanges
OtherInclude stringset.StringSet
OtherExclude stringset.StringSet
OtherInclude mapset.Set[string]
OtherExclude mapset.Set[string]
}
inputs := []string{
@ -37,15 +41,15 @@ func TestParseNumberMenu(t *testing.T) {
makeIntRange(3, 3),
makeIntRange(4, 4),
makeIntRange(5, 5),
}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{
makeIntRange(1, 10),
makeIntRange(5, 15),
}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{
makeIntRange(5, 10),
makeIntRange(85, 90),
}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{
IntRanges{
makeIntRange(1, 1),
@ -58,18 +62,18 @@ func TestParseNumberMenu(t *testing.T) {
makeIntRange(38, 40),
makeIntRange(123, 123),
},
make(stringset.StringSet), make(stringset.StringSet),
mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string](),
},
{IntRanges{}, IntRanges{}, stringset.Make("abort", "all", "none"), make(stringset.StringSet)},
{IntRanges{}, IntRanges{}, stringset.Make("a-b"), stringset.Make("abort", "a-b")},
{IntRanges{}, IntRanges{}, stringset.Make("-9223372036854775809-9223372036854775809"), make(stringset.StringSet)},
{IntRanges{}, IntRanges{}, mapset.NewThreadUnsafeSet("abort", "all", "none"), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{}, IntRanges{}, mapset.NewThreadUnsafeSet("a-b"), mapset.NewThreadUnsafeSet("abort", "a-b")},
{IntRanges{}, IntRanges{}, mapset.NewThreadUnsafeSet("-9223372036854775809-9223372036854775809"), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{
makeIntRange(1, 1),
makeIntRange(2, 2),
makeIntRange(3, 3),
makeIntRange(4, 4),
makeIntRange(5, 5),
}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{
makeIntRange(1, 1),
makeIntRange(2, 2),
@ -79,23 +83,20 @@ func TestParseNumberMenu(t *testing.T) {
makeIntRange(6, 6),
makeIntRange(7, 7),
makeIntRange(8, 8),
}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
{IntRanges{}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
{IntRanges{}, IntRanges{}, make(stringset.StringSet), make(stringset.StringSet)},
{IntRanges{}, IntRanges{}, stringset.Make("a", "b", "c", "d", "e"), make(stringset.StringSet)},
}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{}, IntRanges{}, mapset.NewThreadUnsafeSet[string](), mapset.NewThreadUnsafeSet[string]()},
{IntRanges{}, IntRanges{}, mapset.NewThreadUnsafeSet("a", "b", "c", "d", "e"), mapset.NewThreadUnsafeSet[string]()},
}
for n, in := range inputs {
res := expected[n]
include, exclude, otherInclude, otherExclude := ParseNumberMenu(in)
if !intRangesEqual(include, res.Include) ||
!intRangesEqual(exclude, res.Exclude) ||
!stringset.Equal(otherInclude, res.OtherInclude) ||
!stringset.Equal(otherExclude, res.OtherExclude) {
t.Fatalf("Test %d Failed: Expected: include=%+v exclude=%+v otherInclude=%+v otherExclude=%+v got include=%+v excluive=%+v otherInclude=%+v otherExclude=%+v",
n+1, res.Include, res.Exclude, res.OtherInclude, res.OtherExclude, include, exclude, otherInclude, otherExclude)
}
assert.True(t, intRangesEqual(include, res.Include), "Test %d Failed: Expected: include=%+v got include=%+v", n+1, res.Include, include)
assert.True(t, intRangesEqual(exclude, res.Exclude), "Test %d Failed: Expected: exclude=%+v got exclude=%+v", n+1, res.Exclude, exclude)
assert.True(t, otherInclude.Equal(res.OtherInclude), "Test %d Failed: Expected: otherInclude=%+v got otherInclude=%+v", n+1, res.OtherInclude, otherInclude)
assert.True(t, otherExclude.Equal(res.OtherExclude), "Test %d Failed: Expected: otherExclude=%+v got otherExclude=%+v", n+1, res.OtherExclude, otherExclude)
}
}

View File

@ -2,22 +2,20 @@
package menus
import (
"fmt"
"context"
"io"
"os"
"path/filepath"
mapset "github.com/deckarep/golang-set/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/text"
)
func anyExistInCache(buildDir string, bases []dep.Base) bool {
for _, base := range bases {
pkg := base.Pkgbase()
dir := filepath.Join(buildDir, pkg)
func anyExistInCache(pkgbuildDirs map[string]string) bool {
for _, dir := range pkgbuildDirs {
if _, err := os.Stat(dir); !os.IsNotExist(err) {
return true
}
@ -26,14 +24,20 @@ func anyExistInCache(buildDir string, bases []dep.Base) bool {
return false
}
func Clean(cleanMenuOption bool, buildDir string, bases []dep.Base,
installed stringset.StringSet, noConfirm bool, answerClean string) error {
if !(cleanMenuOption && anyExistInCache(buildDir, bases)) {
func CleanFn(ctx context.Context, run *runtime.Runtime, w io.Writer,
pkgbuildDirsByBase map[string]string, installed mapset.Set[string],
) error {
if len(pkgbuildDirsByBase) == 0 {
return nil // no work to do
}
if !anyExistInCache(pkgbuildDirsByBase) {
return nil
}
skipFunc := func(pkg string) bool {
dir := filepath.Join(buildDir, pkg)
dir := pkgbuildDirsByBase[pkg]
// TOFIX: new install engine dir will always exist, check if unclean instead
if _, err := os.Stat(dir); os.IsNotExist(err) {
return true
}
@ -41,18 +45,32 @@ func Clean(cleanMenuOption bool, buildDir string, bases []dep.Base,
return false
}
toClean, errClean := selectionMenu(buildDir, bases, installed, gotext.Get("Packages to cleanBuild?"),
noConfirm, answerClean, skipFunc)
bases := make([]string, 0, len(pkgbuildDirsByBase))
for pkg := range pkgbuildDirsByBase {
bases = append(bases, pkg)
}
toClean, errClean := selectionMenu(run.Logger, pkgbuildDirsByBase, bases, installed,
gotext.Get("Packages to cleanBuild?"),
settings.NoConfirm, run.Cfg.AnswerClean, skipFunc)
if errClean != nil {
return errClean
}
for i, base := range toClean {
dir := filepath.Join(buildDir, base.Pkgbase())
text.OperationInfoln(gotext.Get("Deleting (%d/%d): %s", i+1, len(toClean), text.Cyan(dir)))
dir := pkgbuildDirsByBase[base]
run.Logger.OperationInfoln(gotext.Get("Deleting (%d/%d): %s", i+1, len(toClean), text.Cyan(dir)))
if err := os.RemoveAll(dir); err != nil {
fmt.Fprintln(os.Stderr, err)
if err := run.CmdBuilder.Show(run.CmdBuilder.BuildGitCmd(ctx, dir, "reset", "--hard", "origin/HEAD")); err != nil {
run.Logger.Warnln(gotext.Get("Unable to clean:"), dir)
return err
}
if err := run.CmdBuilder.Show(run.CmdBuilder.BuildGitCmd(ctx, dir, "clean", "-fdx")); err != nil {
run.Logger.Warnln(gotext.Get("Unable to clean:"), dir)
return err
}
}

View File

@ -4,18 +4,17 @@ package menus
import (
"context"
"fmt"
"os"
"path/filepath"
"io"
"strings"
mapset "github.com/deckarep/golang-set/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/multierror"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/settings/exe"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/multierror"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/text"
)
const (
@ -23,24 +22,23 @@ const (
gitDiffRefName = "AUR_SEEN"
)
func showPkgbuildDiffs(ctx context.Context, cmdBuilder exe.ICmdBuilder, buildDir string, bases []dep.Base, cloned map[string]bool) error {
func showPkgbuildDiffs(ctx context.Context, cmdBuilder exe.ICmdBuilder, logger *text.Logger,
pkgbuildDirs map[string]string, bases []string,
) error {
var errMulti multierror.MultiError
for _, base := range bases {
pkg := base.Pkgbase()
dir := filepath.Join(buildDir, pkg)
for _, pkg := range bases {
dir := pkgbuildDirs[pkg]
start, err := getLastSeenHash(ctx, cmdBuilder, buildDir, pkg)
start, err := getLastSeenHash(ctx, cmdBuilder, dir)
if err != nil {
errMulti.Add(err)
continue
}
if cloned[pkg] {
start = gitEmptyTree
} else {
hasDiff, err := gitHasDiff(ctx, cmdBuilder, buildDir, pkg)
if start != gitEmptyTree {
hasDiff, err := gitHasDiff(ctx, cmdBuilder, dir)
if err != nil {
errMulti.Add(err)
@ -48,7 +46,7 @@ func showPkgbuildDiffs(ctx context.Context, cmdBuilder exe.ICmdBuilder, buildDir
}
if !hasDiff {
text.Warnln(gotext.Get("%s: No changes -- skipping", text.Cyan(base.String())))
logger.Warnln(gotext.Get("%s: No changes -- skipping", text.Cyan(pkg)))
continue
}
@ -73,12 +71,12 @@ func showPkgbuildDiffs(ctx context.Context, cmdBuilder exe.ICmdBuilder, buildDir
// Check whether or not a diff exists between the last reviewed diff and
// HEAD@{upstream}.
func gitHasDiff(ctx context.Context, cmdBuilder exe.ICmdBuilder, path, name string) (bool, error) {
if gitHasLastSeenRef(ctx, cmdBuilder, path, name) {
func gitHasDiff(ctx context.Context, cmdBuilder exe.ICmdBuilder, dir string) (bool, error) {
if gitHasLastSeenRef(ctx, cmdBuilder, dir) {
stdout, stderr, err := cmdBuilder.Capture(
cmdBuilder.BuildGitCmd(ctx, filepath.Join(path, name), "rev-parse", gitDiffRefName, "HEAD@{upstream}"))
cmdBuilder.BuildGitCmd(ctx, dir, "rev-parse", gitDiffRefName, "HEAD@{upstream}"))
if err != nil {
return false, fmt.Errorf("%s%s", stderr, err)
return false, fmt.Errorf("%s%w", stderr, err)
}
lines := strings.Split(stdout, "\n")
@ -87,30 +85,30 @@ func gitHasDiff(ctx context.Context, cmdBuilder exe.ICmdBuilder, path, name stri
return lastseen != upstream, nil
}
// If YAY_DIFF_REVIEW does not exists, we have never reviewed a diff for this package
// If AUR_SEEN does not exist, we have never reviewed a diff for this package
// and should display it.
return true, nil
}
// Return wether or not we have reviewed a diff yet. It checks for the existence of
// YAY_DIFF_REVIEW in the git ref-list.
func gitHasLastSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, path, name string) bool {
// Return whether or not we have reviewed a diff yet. It checks for the existence of
// AUR_SEEN in the git ref-list.
func gitHasLastSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, dir string) bool {
_, _, err := cmdBuilder.Capture(
cmdBuilder.BuildGitCmd(ctx,
filepath.Join(path, name), "rev-parse", "--quiet", "--verify", gitDiffRefName))
dir, "rev-parse", "--quiet", "--verify", gitDiffRefName))
return err == nil
}
// Returns the last reviewed hash. If YAY_DIFF_REVIEW exists it will return this hash.
// Returns the last reviewed hash. If AUR_SEEN exists it will return this hash.
// If it does not, it will return the empty tree as no diffs have been reviewed yet.
func getLastSeenHash(ctx context.Context, cmdBuilder exe.ICmdBuilder, path, name string) (string, error) {
if gitHasLastSeenRef(ctx, cmdBuilder, path, name) {
func getLastSeenHash(ctx context.Context, cmdBuilder exe.ICmdBuilder, dir string) (string, error) {
if gitHasLastSeenRef(ctx, cmdBuilder, dir) {
stdout, stderr, err := cmdBuilder.Capture(
cmdBuilder.BuildGitCmd(ctx,
filepath.Join(path, name), "rev-parse", gitDiffRefName))
dir, "rev-parse", gitDiffRefName))
if err != nil {
return "", fmt.Errorf("%s %s", stderr, err)
return "", fmt.Errorf("%s %w", stderr, err)
}
lines := strings.Split(stdout, "\n")
@ -121,26 +119,25 @@ func getLastSeenHash(ctx context.Context, cmdBuilder exe.ICmdBuilder, path, name
return gitEmptyTree, nil
}
// Update the YAY_DIFF_REVIEW ref to HEAD. We use this ref to determine which diff were
// Update the AUR_SEEN ref to HEAD. We use this ref to determine which diffs were
// reviewed by the user.
func gitUpdateSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, path, name string) error {
func gitUpdateSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, dir string) error {
_, stderr, err := cmdBuilder.Capture(
cmdBuilder.BuildGitCmd(ctx,
filepath.Join(path, name), "update-ref", gitDiffRefName, "HEAD"))
dir, "update-ref", gitDiffRefName, "HEAD"))
if err != nil {
return fmt.Errorf("%s %s", stderr, err)
return fmt.Errorf("%s %w", stderr, err)
}
return nil
}
func updatePkgbuildSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, buildDir string, bases []dep.Base) error {
func updatePkgbuildSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, pkgbuildDirs map[string]string, bases []string) error {
var errMulti multierror.MultiError
for _, base := range bases {
pkg := base.Pkgbase()
if err := gitUpdateSeenRef(ctx, cmdBuilder, buildDir, pkg); err != nil {
for _, pkg := range bases {
dir := pkgbuildDirs[pkg]
if err := gitUpdateSeenRef(ctx, cmdBuilder, dir); err != nil {
errMulti.Add(err)
}
}
@ -148,31 +145,35 @@ func updatePkgbuildSeenRef(ctx context.Context, cmdBuilder exe.ICmdBuilder, buil
return errMulti.Return()
}
func Diff(ctx context.Context, cmdBuilder exe.ICmdBuilder,
buildDir string, diffMenuOption bool, bases []dep.Base,
installed stringset.StringSet, cloned map[string]bool, noConfirm bool, diffDefaultAnswer string,
func DiffFn(ctx context.Context, run *runtime.Runtime, w io.Writer,
pkgbuildDirsByBase map[string]string, installed mapset.Set[string],
) error {
if !diffMenuOption {
return nil
if len(pkgbuildDirsByBase) == 0 {
return nil // no work to do
}
toDiff, errMenu := selectionMenu(buildDir, bases, installed, gotext.Get("Diffs to show?"),
noConfirm, diffDefaultAnswer, nil)
bases := make([]string, 0, len(pkgbuildDirsByBase))
for base := range pkgbuildDirsByBase {
bases = append(bases, base)
}
toDiff, errMenu := selectionMenu(run.Logger, pkgbuildDirsByBase, bases, installed, gotext.Get("Diffs to show?"),
settings.NoConfirm, run.Cfg.AnswerDiff, nil)
if errMenu != nil || len(toDiff) == 0 {
return errMenu
}
if errD := showPkgbuildDiffs(ctx, cmdBuilder, buildDir, toDiff, cloned); errD != nil {
if errD := showPkgbuildDiffs(ctx, run.CmdBuilder, run.Logger, pkgbuildDirsByBase, toDiff); errD != nil {
return errD
}
fmt.Println()
run.Logger.Println()
if !text.ContinueTask(os.Stdin, gotext.Get("Proceed with install?"), true, false) {
if !run.Logger.ContinueTask(gotext.Get("Proceed with install?"), true, false) {
return settings.ErrUserAbort{}
}
if errUpd := updatePkgbuildSeenRef(ctx, cmdBuilder, buildDir, toDiff); errUpd != nil {
if errUpd := updatePkgbuildSeenRef(ctx, run.CmdBuilder, pkgbuildDirsByBase, toDiff); errUpd != nil {
return errUpd
}
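Taken together, the helpers above implement the review cycle around the AUR_SEEN ref; a compressed sketch of that flow for a single already-cloned PKGBUILD directory (the exact diff arguments are an assumption, not copied from this change set):

// Hypothetical single-directory review cycle using the helpers above.
func reviewOnce(ctx context.Context, cmdBuilder exe.ICmdBuilder, dir string) error {
	start, err := getLastSeenHash(ctx, cmdBuilder, dir) // AUR_SEEN, or the empty tree
	if err != nil {
		return err
	}
	// Show everything between the last reviewed state and upstream HEAD.
	if err := cmdBuilder.Show(cmdBuilder.BuildGitCmd(ctx, dir,
		"diff", start+"..HEAD@{upstream}")); err != nil {
		return err
	}
	// Mark the current HEAD as reviewed for the next run.
	return gitUpdateSeenRef(ctx, cmdBuilder, dir)
}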

View File

@ -2,29 +2,30 @@
package menus
import (
"context"
"errors"
"fmt"
"io"
"os"
"os/exec"
"path/filepath"
"strings"
gosrc "github.com/Morganamilo/go-srcinfo"
mapset "github.com/deckarep/golang-set/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/text"
)
// Editor returns the preferred system editor.
func editor(editorConfig, editorFlags string, noConfirm bool) (editor string, args []string) {
func editor(log *text.Logger, editorConfig, editorFlags string, noConfirm bool) (editor string, args []string) {
switch {
case editorConfig != "":
editor, err := exec.LookPath(editorConfig)
if err != nil {
fmt.Fprintln(os.Stderr, err)
log.Errorln(err)
} else {
return editor, strings.Fields(editorFlags)
}
@ -34,7 +35,7 @@ func editor(editorConfig, editorFlags string, noConfirm bool) (editor string, ar
if editorArgs := strings.Fields(os.Getenv("VISUAL")); len(editorArgs) != 0 {
editor, err := exec.LookPath(editorArgs[0])
if err != nil {
fmt.Fprintln(os.Stderr, err)
log.Errorln(err)
} else {
return editor, editorArgs[1:]
}
@ -45,7 +46,7 @@ func editor(editorConfig, editorFlags string, noConfirm bool) (editor string, ar
if editorArgs := strings.Fields(os.Getenv("EDITOR")); len(editorArgs) != 0 {
editor, err := exec.LookPath(editorArgs[0])
if err != nil {
fmt.Fprintln(os.Stderr, err)
log.Errorln(err)
} else {
return editor, editorArgs[1:]
}
@ -53,16 +54,15 @@ func editor(editorConfig, editorFlags string, noConfirm bool) (editor string, ar
fallthrough
default:
fmt.Fprintln(os.Stderr)
text.Errorln(gotext.Get("%s is not set", text.Bold(text.Cyan("$EDITOR"))))
text.Warnln(gotext.Get("Add %s or %s to your environment variables", text.Bold(text.Cyan("$EDITOR")), text.Bold(text.Cyan("$VISUAL"))))
log.Errorln("\n", gotext.Get("%s is not set", text.Bold(text.Cyan("$EDITOR"))))
log.Warnln(gotext.Get("Add %s or %s to your environment variables", text.Bold(text.Cyan("$EDITOR")), text.Bold(text.Cyan("$VISUAL"))))
for {
text.Infoln(gotext.Get("Edit PKGBUILD with?"))
log.Infoln(gotext.Get("Edit PKGBUILD with?"))
editorInput, err := text.GetInput("", noConfirm)
editorInput, err := log.GetInput("", noConfirm)
if err != nil {
fmt.Fprintln(os.Stderr, err)
log.Errorln(err)
continue
}
@ -73,7 +73,7 @@ func editor(editorConfig, editorFlags string, noConfirm bool) (editor string, ar
editor, err := exec.LookPath(editorArgs[0])
if err != nil {
fmt.Fprintln(os.Stderr, err)
log.Errorln(err)
continue
}
@ -82,25 +82,26 @@ func editor(editorConfig, editorFlags string, noConfirm bool) (editor string, ar
}
}
func editPkgbuilds(buildDir string, bases []dep.Base, editorConfig,
func editPkgbuilds(log *text.Logger, pkgbuildDirs map[string]string, bases []string, editorConfig,
editorFlags string, srcinfos map[string]*gosrc.Srcinfo, noConfirm bool,
) error {
pkgbuilds := make([]string, 0, len(bases))
for _, base := range bases {
pkg := base.Pkgbase()
dir := filepath.Join(buildDir, pkg)
for _, pkg := range bases {
dir := pkgbuildDirs[pkg]
pkgbuilds = append(pkgbuilds, filepath.Join(dir, "PKGBUILD"))
for _, splitPkg := range srcinfos[pkg].SplitPackages() {
if splitPkg.Install != "" {
pkgbuilds = append(pkgbuilds, filepath.Join(dir, splitPkg.Install))
if srcinfos != nil {
for _, splitPkg := range srcinfos[pkg].SplitPackages() {
if splitPkg.Install != "" {
pkgbuilds = append(pkgbuilds, filepath.Join(dir, splitPkg.Install))
}
}
}
}
if len(pkgbuilds) > 0 {
editor, editorArgs := editor(editorConfig, editorFlags, noConfirm)
editor, editorArgs := editor(log, editorConfig, editorFlags, noConfirm)
editorArgs = append(editorArgs, pkgbuilds...)
editcmd := exec.Command(editor, editorArgs...)
editcmd.Stdin, editcmd.Stdout, editcmd.Stderr = os.Stdin, os.Stdout, os.Stderr
@ -113,27 +114,33 @@ func editPkgbuilds(buildDir string, bases []dep.Base, editorConfig,
return nil
}
func Edit(editMenuOption bool, buildDir string, bases []dep.Base, editorConfig,
editorFlags string, installed stringset.StringSet, srcinfos map[string]*gosrc.Srcinfo,
noConfirm bool, editDefaultAnswer string,
func EditFn(ctx context.Context, run *runtime.Runtime, w io.Writer,
pkgbuildDirsByBase map[string]string, installed mapset.Set[string],
) error {
if !editMenuOption {
return nil
if len(pkgbuildDirsByBase) == 0 {
return nil // no work to do
}
toEdit, errMenu := selectionMenu(buildDir, bases,
installed, gotext.Get("PKGBUILDs to edit?"), noConfirm, editDefaultAnswer, nil)
bases := make([]string, 0, len(pkgbuildDirsByBase))
for pkg := range pkgbuildDirsByBase {
bases = append(bases, pkg)
}
toEdit, errMenu := selectionMenu(run.Logger, pkgbuildDirsByBase, bases, installed,
gotext.Get("PKGBUILDs to edit?"), settings.NoConfirm, run.Cfg.AnswerEdit, nil)
if errMenu != nil || len(toEdit) == 0 {
return errMenu
}
if errEdit := editPkgbuilds(buildDir, toEdit, editorConfig, editorFlags, srcinfos, noConfirm); errEdit != nil {
// TOFIX: remove or use srcinfo data
if errEdit := editPkgbuilds(run.Logger, pkgbuildDirsByBase,
toEdit, run.Cfg.Editor, run.Cfg.EditorFlags, nil, settings.NoConfirm); errEdit != nil {
return errEdit
}
fmt.Println()
run.Logger.Println()
if !text.ContinueTask(os.Stdin, gotext.Get("Proceed with install?"), true, false) {
if !run.Logger.ContinueTask(gotext.Get("Proceed with install?"), true, false) {
return settings.ErrUserAbort{}
}

View File

@ -3,31 +3,31 @@ package menus
import (
"fmt"
"os"
"path/filepath"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/dep"
"github.com/Jguer/yay/v11/pkg/intrange"
"github.com/Jguer/yay/v11/pkg/settings"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/intrange"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/text"
mapset "github.com/deckarep/golang-set/v2"
)
func pkgbuildNumberMenu(buildDir string, bases []dep.Base, installed stringset.StringSet) {
func pkgbuildNumberMenu(logger *text.Logger, pkgbuildDirs map[string]string,
bases []string, installed mapset.Set[string],
) {
toPrint := ""
for n, base := range bases {
pkg := base.Pkgbase()
dir := filepath.Join(buildDir, pkg)
for n, pkgBase := range bases {
dir := pkgbuildDirs[pkgBase]
toPrint += fmt.Sprintf(text.Magenta("%3d")+" %-40s", len(pkgbuildDirs)-n,
text.Bold(pkgBase))
toPrint += fmt.Sprintf(text.Magenta("%3d")+" %-40s", len(bases)-n,
text.Bold(base.String()))
if base.AnyIsInSet(installed) {
if installed.Contains(pkgBase) {
toPrint += text.Bold(text.Green(gotext.Get(" (Installed)")))
}
// TODO: remove or refactor to check if git dir is unclean
if _, err := os.Stat(dir); !os.IsNotExist(err) {
toPrint += text.Bold(text.Green(gotext.Get(" (Build Files Exist)")))
}
@ -35,68 +35,67 @@ func pkgbuildNumberMenu(buildDir string, bases []dep.Base, installed stringset.S
toPrint += "\n"
}
fmt.Print(toPrint)
logger.Print(toPrint)
}
func selectionMenu(buildDir string, bases []dep.Base, installed stringset.StringSet,
message string, noConfirm bool, defaultAnswer string, skipFunc func(string) bool) ([]dep.Base, error) {
selected := make([]dep.Base, 0)
func selectionMenu(logger *text.Logger, pkgbuildDirs map[string]string, bases []string, installed mapset.Set[string],
message string, noConfirm bool, defaultAnswer string, skipFunc func(string) bool,
) ([]string, error) {
selected := make([]string, 0)
pkgbuildNumberMenu(buildDir, bases, installed)
pkgbuildNumberMenu(logger, pkgbuildDirs, bases, installed)
text.Infoln(message)
text.Infoln(gotext.Get("%s [A]ll [Ab]ort [I]nstalled [No]tInstalled or (1 2 3, 1-3, ^4)", text.Cyan(gotext.Get("[N]one"))))
logger.Infoln(message)
logger.Infoln(gotext.Get("%s [A]ll [Ab]ort [I]nstalled [No]tInstalled or (1 2 3, 1-3, ^4)", text.Cyan(gotext.Get("[N]one"))))
selectInput, err := text.GetInput(defaultAnswer, noConfirm)
selectInput, err := logger.GetInput(defaultAnswer, noConfirm)
if err != nil {
return nil, err
}
eInclude, eExclude, eOtherInclude, eOtherExclude := intrange.ParseNumberMenu(selectInput)
eIsInclude := len(eExclude) == 0 && len(eOtherExclude) == 0
eIsInclude := len(eExclude) == 0 && eOtherExclude.Cardinality() == 0
if eOtherInclude.Get("abort") || eOtherInclude.Get("ab") {
if eOtherInclude.Contains("abort") || eOtherInclude.Contains("ab") {
return nil, settings.ErrUserAbort{}
}
if eOtherInclude.Get("n") || eOtherInclude.Get("none") {
if eOtherInclude.Contains("n") || eOtherInclude.Contains("none") {
return selected, nil
}
for i, base := range bases {
pkg := base.Pkgbase()
if skipFunc != nil && skipFunc(pkg) {
for i, pkgBase := range bases {
if skipFunc != nil && skipFunc(pkgBase) {
continue
}
anyInstalled := base.AnyIsInSet(installed)
anyInstalled := installed.Contains(pkgBase)
if !eIsInclude && eExclude.Get(len(bases)-i) {
continue
}
if anyInstalled && (eOtherInclude.Get("i") || eOtherInclude.Get("installed")) {
selected = append(selected, base)
if anyInstalled && (eOtherInclude.Contains("i") || eOtherInclude.Contains("installed")) {
selected = append(selected, pkgBase)
continue
}
if !anyInstalled && (eOtherInclude.Get("no") || eOtherInclude.Get("notinstalled")) {
selected = append(selected, base)
if !anyInstalled && (eOtherInclude.Contains("no") || eOtherInclude.Contains("notinstalled")) {
selected = append(selected, pkgBase)
continue
}
if eOtherInclude.Get("a") || eOtherInclude.Get("all") {
selected = append(selected, base)
if eOtherInclude.Contains("a") || eOtherInclude.Contains("all") {
selected = append(selected, pkgBase)
continue
}
if eIsInclude && (eInclude.Get(len(bases)-i) || eOtherInclude.Get(pkg)) {
selected = append(selected, base)
if eIsInclude && (eInclude.Get(len(bases)-i) || eOtherInclude.Contains(pkgBase)) {
selected = append(selected, pkgBase)
}
if !eIsInclude && (!eExclude.Get(len(bases)-i) && !eOtherExclude.Get(pkg)) {
selected = append(selected, base)
if !eIsInclude && (!eExclude.Get(len(bases)-i) && !eOtherExclude.Contains(pkgBase)) {
selected = append(selected, pkgBase)
}
}

View File

@ -4,15 +4,13 @@ import (
"bytes"
"context"
"encoding/xml"
"fmt"
"html"
"io"
"net/http"
"os"
"strings"
"time"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/text"
)
type item struct {
@ -23,13 +21,13 @@ type item struct {
Creator string `xml:"dc:creator"`
}
func (item *item) print(buildTime time.Time, all, quiet bool) {
func (item *item) printNews(logger *text.Logger, buildTime time.Time, all, quiet bool) {
var fd string
date, err := time.Parse(time.RFC1123Z, item.PubDate)
if err != nil {
fmt.Fprintln(os.Stderr, err)
logger.Errorln(err)
} else {
fd = text.FormatTime(int(date.Unix()))
if !all && !buildTime.IsZero() {
@ -39,11 +37,11 @@ func (item *item) print(buildTime time.Time, all, quiet bool) {
}
}
fmt.Println(text.Bold(text.Magenta(fd)), text.Bold(strings.TrimSpace(item.Title)))
logger.Println(text.Bold(text.Magenta(fd)), text.Bold(strings.TrimSpace(item.Title)))
if !quiet {
desc := strings.TrimSpace(parseNews(item.Description))
fmt.Println(desc)
logger.Println(desc)
}
}
@ -60,7 +58,9 @@ type rss struct {
Channel channel `xml:"channel"`
}
func PrintNewsFeed(ctx context.Context, client *http.Client, cutOffDate time.Time, bottomUp, all, quiet bool) error {
func PrintNewsFeed(ctx context.Context, client *http.Client, logger *text.Logger,
cutOffDate time.Time, bottomUp, all, quiet bool,
) error {
req, err := http.NewRequestWithContext(ctx, http.MethodGet, "https://archlinux.org/feeds/news", http.NoBody)
if err != nil {
return err
@ -87,11 +87,11 @@ func PrintNewsFeed(ctx context.Context, client *http.Client, cutOffDate time.Tim
if bottomUp {
for i := len(rssGot.Channel.Items) - 1; i >= 0; i-- {
rssGot.Channel.Items[i].print(cutOffDate, all, quiet)
rssGot.Channel.Items[i].printNews(logger, cutOffDate, all, quiet)
}
} else {
for i := 0; i < len(rssGot.Channel.Items); i++ {
rssGot.Channel.Items[i].print(cutOffDate, all, quiet)
rssGot.Channel.Items[i].printNews(logger, cutOffDate, all, quiet)
}
}
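A hedged usage sketch of the new PrintNewsFeed signature; the logger construction mirrors the tests below, and the one-month cutoff is an arbitrary example value:

// Example call: oldest-first (bottomUp), only items newer than the cutoff, with bodies.
logger := text.NewLogger(os.Stdout, os.Stderr, os.Stdin, false, "news")
if err := news.PrintNewsFeed(context.Background(), &http.Client{}, logger,
	time.Now().AddDate(0, -1, 0), true, false, false); err != nil {
	logger.Errorln(err)
}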

View File

@ -1,3 +1,6 @@
//go:build !integration
// +build !integration
package news
import (
@ -5,12 +8,15 @@ import (
"io"
"net/http"
"os"
"strings"
"testing"
"time"
"github.com/bradleyjkemp/cupaloy"
"github.com/stretchr/testify/assert"
"gopkg.in/h2non/gock.v1"
"github.com/Jguer/yay/v12/pkg/text"
)
const lastNews = `
@ -121,6 +127,7 @@ func TestPrintNewsFeed(t *testing.T) {
{name: "latest-quiet", args: args{bottomUp: true, cutOffDate: lastNewsTime, all: false, quiet: true}, wantErr: false},
{name: "latest-quiet-topdown", args: args{bottomUp: false, cutOffDate: lastNewsTime, all: false, quiet: true}, wantErr: false},
}
t.Setenv("TZ", "UTC")
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
@ -131,17 +138,16 @@ func TestPrintNewsFeed(t *testing.T) {
defer gock.Off()
rescueStdout := os.Stdout
r, w, _ := os.Pipe()
os.Stdout = w
logger := text.NewLogger(w, w, strings.NewReader(""), false, "logger")
err := PrintNewsFeed(context.TODO(), &http.Client{}, tt.args.cutOffDate, tt.args.bottomUp, tt.args.all, tt.args.quiet)
err := PrintNewsFeed(context.Background(), &http.Client{}, logger,
tt.args.cutOffDate, tt.args.bottomUp, tt.args.all, tt.args.quiet)
assert.NoError(t, err)
w.Close()
out, _ := io.ReadAll(r)
cupaloy.SnapshotT(t, out)
os.Stdout = rescueStdout
})
}
}
@ -160,15 +166,14 @@ func TestPrintNewsFeedSameDay(t *testing.T) {
defer gock.Off()
rescueStdout := os.Stdout
r, w, _ := os.Pipe()
os.Stdout = w
logger := text.NewLogger(w, w, strings.NewReader(""), false, "logger")
err := PrintNewsFeed(context.TODO(), &http.Client{}, lastNewsTime, true, false, false)
err := PrintNewsFeed(context.Background(), &http.Client{}, logger,
lastNewsTime, true, false, false)
assert.NoError(t, err)
w.Close()
out, _ := io.ReadAll(r)
cupaloy.SnapshotT(t, out)
os.Stdout = rescueStdout
}

View File

@ -1,5 +0,0 @@
 -> ABAF11C65A2970B130ABE3C479BE3E4300411886, required by: dummy-1 (dummy-1 dummy-2)
:: Importing keys with gpg...
:: PGP keys need importing:

View File

@ -1,5 +0,0 @@
 -> 487EACC08557AD082088DABA1EB2638FF56C0C53, required by: cower
:: Importing keys with gpg...
:: PGP keys need importing:

View File

@ -1,5 +0,0 @@
 -> C52048C0C0748FEE227D47A2702353E0F7E48EDB, required by: dummy-3
:: Importing keys with gpg...
:: PGP keys need importing:

View File

@ -1,6 +0,0 @@
 -> 11E521D646982372EB577A1F8F0871F202119294, required by: libc++
 -> B6C8F98282B944E3B0D5C2530FC3042E345AD05D, required by: libc++
:: Importing keys with gpg...
:: PGP keys need importing:

View File

@ -1,266 +0,0 @@
package pgp
import (
"bytes"
"context"
"fmt"
"io"
"net/http"
"os"
"path"
"regexp"
"sort"
"strings"
"testing"
"time"
aur "github.com/Jguer/aur"
gosrc "github.com/Morganamilo/go-srcinfo"
"github.com/bradleyjkemp/cupaloy"
"github.com/Jguer/yay/v11/pkg/dep"
)
const (
// The default port used by the PGP key server.
gpgServerPort = 11371
)
func init() {
http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
regex := regexp.MustCompile(`search=0[xX]([a-fA-F0-9]+)`)
matches := regex.FindStringSubmatch(r.RequestURI)
data := ""
if matches != nil {
data = getPgpKey(matches[1])
}
w.Header().Set("Content-Type", "application/pgp-keys")
_, err := w.Write([]byte(data))
if err != nil {
fmt.Fprintln(os.Stderr, err)
}
})
}
func newPkg(basename string) *aur.Pkg {
return &aur.Pkg{Name: basename, PackageBase: basename}
}
func getPgpKey(key string) string {
var buffer bytes.Buffer
if contents, err := os.ReadFile(path.Join("testdata", key)); err == nil {
buffer.WriteString("-----BEGIN PGP PUBLIC KEY BLOCK-----\n")
buffer.WriteString("Version: SKS 1.1.6\n")
buffer.WriteString("Comment: Hostname: yay\n\n")
buffer.Write(contents)
buffer.WriteString("\n-----END PGP PUBLIC KEY BLOCK-----\n")
}
return buffer.String()
}
func startPgpKeyServer() *http.Server {
srv := &http.Server{Addr: fmt.Sprintf("127.0.0.1:%d", gpgServerPort), ReadHeaderTimeout: 1 * time.Second}
go func() {
err := srv.ListenAndServe()
if err != nil {
fmt.Fprintln(os.Stderr, err)
}
}()
return srv
}
func TestImportKeys(t *testing.T) {
keyringDir := t.TempDir()
server := startPgpKeyServer()
defer func() {
err := server.Shutdown(context.TODO())
if err != nil {
fmt.Fprintln(os.Stderr, err)
}
}()
casetests := []struct {
keys []string
wantError bool
}{
// Single key, should succeed.
// C52048C0C0748FEE227D47A2702353E0F7E48EDB: Thomas Dickey.
{
keys: []string{"C52048C0C0748FEE227D47A2702353E0F7E48EDB"},
wantError: false,
},
// Two keys, should succeed as well.
// 11E521D646982372EB577A1F8F0871F202119294: Tom Stellard.
// B6C8F98282B944E3B0D5C2530FC3042E345AD05D: Hans Wennborg.
{
keys: []string{
"11E521D646982372EB577A1F8F0871F202119294",
"B6C8F98282B944E3B0D5C2530FC3042E345AD05D",
},
wantError: false,
},
// Single invalid key, should fail.
{
keys: []string{"THIS-SHOULD-FAIL"},
wantError: true,
},
// Two invalid keys, should fail.
{
keys: []string{"THIS-SHOULD-FAIL", "THIS-ONE-SHOULD-FAIL-TOO"},
wantError: true,
},
// Invalid + valid key. Should fail as well.
// 647F28654894E3BD457199BE38DBBDC86092693E: Greg Kroah-Hartman.
{
keys: []string{
"THIS-SHOULD-FAIL",
"647F28654894E3BD457199BE38DBBDC86092693E",
},
wantError: true,
},
}
for _, tt := range casetests {
err := importKeys(tt.keys, "gpg", fmt.Sprintf("--homedir %s --keyserver 127.0.0.1", keyringDir))
if !tt.wantError {
if err != nil {
t.Fatalf("Got error %q, want no error", err)
}
continue
}
// Here, we want to see the error.
if err == nil {
t.Fatalf("Got no error; want error")
}
}
}
func makeSrcinfo(pkgbase string, pgpkeys ...string) *gosrc.Srcinfo {
srcinfo := gosrc.Srcinfo{}
srcinfo.Pkgbase = pkgbase
srcinfo.ValidPGPKeys = pgpkeys
return &srcinfo
}
func TestCheckPgpKeys(t *testing.T) {
keyringDir := t.TempDir()
server := startPgpKeyServer()
defer func() {
err := server.Shutdown(context.TODO())
if err != nil {
fmt.Fprintln(os.Stderr, err)
}
}()
casetests := []struct {
name string
pkgs dep.Base
srcinfos map[string]*gosrc.Srcinfo
wantError bool
}{
// cower: single package, one valid key not yet in the keyring.
// 487EACC08557AD082088DABA1EB2638FF56C0C53: Dave Reisner.
{
name: " one valid key not yet in the keyring",
pkgs: dep.Base{newPkg("cower")},
srcinfos: map[string]*gosrc.Srcinfo{"cower": makeSrcinfo("cower", "487EACC08557AD082088DABA1EB2638FF56C0C53")},
wantError: false,
},
// libc++: single package, two valid keys not yet in the keyring.
// 11E521D646982372EB577A1F8F0871F202119294: Tom Stellard.
// B6C8F98282B944E3B0D5C2530FC3042E345AD05D: Hans Wennborg.
{
name: "two valid keys not yet in the keyring",
pkgs: dep.Base{newPkg("libc++")},
srcinfos: map[string]*gosrc.Srcinfo{
"libc++": makeSrcinfo("libc++", "11E521D646982372EB577A1F8F0871F202119294", "B6C8F98282B944E3B0D5C2530FC3042E345AD05D"),
},
wantError: false,
},
// Two dummy packages requiring the same key.
// ABAF11C65A2970B130ABE3C479BE3E4300411886: Linus Torvalds.
{
name: "Two dummy packages requiring the same key",
pkgs: dep.Base{newPkg("dummy-1"), newPkg("dummy-2")},
srcinfos: map[string]*gosrc.Srcinfo{
"dummy-1": makeSrcinfo("dummy-1",
"ABAF11C65A2970B130ABE3C479BE3E4300411886"),
"dummy-2": makeSrcinfo("dummy-2", "ABAF11C65A2970B130ABE3C479BE3E4300411886"),
},
wantError: false,
},
// dummy package: single package, two valid keys, one of them already
// in the keyring.
// 11E521D646982372EB577A1F8F0871F202119294: Tom Stellard.
// C52048C0C0748FEE227D47A2702353E0F7E48EDB: Thomas Dickey.
{
name: "one already in keyring",
pkgs: dep.Base{newPkg("dummy-3")},
srcinfos: map[string]*gosrc.Srcinfo{
"dummy-3": makeSrcinfo("dummy-3", "11E521D646982372EB577A1F8F0871F202119294", "C52048C0C0748FEE227D47A2702353E0F7E48EDB"),
},
wantError: false,
},
// Two dummy packages with existing keys.
{
name: "two existing",
pkgs: dep.Base{newPkg("dummy-4"), newPkg("dummy-5")},
srcinfos: map[string]*gosrc.Srcinfo{
"dummy-4": makeSrcinfo("dummy-4", "11E521D646982372EB577A1F8F0871F202119294"),
"dummy-5": makeSrcinfo("dummy-5", "C52048C0C0748FEE227D47A2702353E0F7E48EDB"),
},
wantError: false,
},
// Dummy package with invalid key, should fail.
{
name: "one invalid",
pkgs: dep.Base{newPkg("dummy-7")},
srcinfos: map[string]*gosrc.Srcinfo{"dummy-7": makeSrcinfo("dummy-7", "THIS-SHOULD-FAIL")},
wantError: true,
},
// Dummy package with both an invalid an another valid key, should fail.
// A314827C4E4250A204CE6E13284FC34C8E4B1A25: Thomas Bächler.
{
name: "one invalid, one valid",
pkgs: dep.Base{newPkg("dummy-8")},
srcinfos: map[string]*gosrc.Srcinfo{"dummy-8": makeSrcinfo("dummy-8", "A314827C4E4250A204CE6E13284FC34C8E4B1A25", "THIS-SHOULD-FAIL")},
wantError: true,
},
}
for _, tt := range casetests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
rescueStdout := os.Stdout
r, w, _ := os.Pipe()
os.Stdout = w
err := CheckPgpKeys([]dep.Base{tt.pkgs}, tt.srcinfos, "gpg",
fmt.Sprintf("--homedir %s --keyserver 127.0.0.1", keyringDir), true)
if !tt.wantError {
if err != nil {
t.Fatalf("Got error %q, want no error", err)
}
w.Close()
out, _ := io.ReadAll(r)
os.Stdout = rescueStdout
splitLines := strings.Split(string(out), "\n")
sort.Strings(splitLines)
cupaloy.SnapshotT(t, strings.Join(splitLines, "\n"))
return
}
// Here, we want to see the error.
if err == nil {
t.Fatalf("Got no error; want error")
}
})
}
}

View File

@ -1,100 +0,0 @@
package query
import (
"context"
"sync"
"github.com/Jguer/aur"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/intrange"
"github.com/Jguer/yay/v11/pkg/multierror"
"github.com/Jguer/yay/v11/pkg/text"
)
type Pkg = aur.Pkg
// Queries the aur for information about specified packages.
// All packages should be queried in a single aur request except when the number
// of packages exceeds the number set in config.RequestSplitN.
// If the number does exceed config.RequestSplitN multiple aur requests will be
// performed concurrently.
func AURInfo(ctx context.Context, aurClient aur.ClientInterface, names []string, warnings *AURWarnings, splitN int) ([]*Pkg, error) {
info := make([]*Pkg, 0, len(names))
seen := make(map[string]int)
var (
mux sync.Mutex
wg sync.WaitGroup
errs multierror.MultiError
)
makeRequest := func(n, max int) {
defer wg.Done()
tempInfo, requestErr := aurClient.Info(ctx, names[n:max])
if requestErr != nil {
errs.Add(requestErr)
return
}
mux.Lock()
for i := range tempInfo {
info = append(info, &tempInfo[i])
}
mux.Unlock()
}
for n := 0; n < len(names); n += splitN {
max := intrange.Min(len(names), n+splitN)
wg.Add(1)
go makeRequest(n, max)
}
wg.Wait()
if err := errs.Return(); err != nil {
return info, err
}
for k, pkg := range info {
seen[pkg.Name] = k
}
for _, name := range names {
i, ok := seen[name]
if !ok && !warnings.Ignore.Get(name) {
warnings.Missing = append(warnings.Missing, name)
continue
}
pkg := info[i]
if pkg.Maintainer == "" && !warnings.Ignore.Get(name) {
warnings.Orphans = append(warnings.Orphans, name)
}
if pkg.OutOfDate != 0 && !warnings.Ignore.Get(name) {
warnings.OutOfDate = append(warnings.OutOfDate, name)
}
}
return info, nil
}
func AURInfoPrint(ctx context.Context, aurClient aur.ClientInterface, names []string, splitN int) ([]*Pkg, error) {
text.OperationInfoln(gotext.Get("Querying AUR..."))
warnings := &AURWarnings{}
info, err := AURInfo(ctx, aurClient, names, warnings, splitN)
if err != nil {
return info, err
}
warnings.Print()
return info, nil
}

View File

@ -1,47 +1,92 @@
package query
import (
"fmt"
"strings"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/aur"
"github.com/Jguer/go-alpm/v2"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/text"
)
type AURWarnings struct {
Orphans []string
OutOfDate []string
Missing []string
Ignore stringset.StringSet
Orphans []string
OutOfDate []string
Missing []string
LocalNewer []string
log *text.Logger
}
func NewWarnings() *AURWarnings {
return &AURWarnings{Ignore: make(stringset.StringSet)}
func NewWarnings(logger *text.Logger) *AURWarnings {
return &AURWarnings{log: logger}
}
func (warnings *AURWarnings) AddToWarnings(remote map[string]alpm.IPackage, aurPkg *aur.Pkg) {
name := aurPkg.Name
pkg, ok := remote[name]
if !ok {
return
}
if aurPkg.Maintainer == "" && !pkg.ShouldIgnore() {
warnings.Orphans = append(warnings.Orphans, name)
}
if aurPkg.OutOfDate != 0 && !pkg.ShouldIgnore() {
warnings.OutOfDate = append(warnings.OutOfDate, name)
}
if !pkg.ShouldIgnore() && !isDevelPackage(pkg) && db.VerCmp(pkg.Version(), aurPkg.Version) > 0 {
left, right := GetVersionDiff(pkg.Version(), aurPkg.Version)
newerMsg := gotext.Get("%s: local (%s) is newer than AUR (%s)",
text.Cyan(name),
left, right,
)
warnings.LocalNewer = append(warnings.LocalNewer, newerMsg)
}
}
func (warnings *AURWarnings) CalculateMissing(remoteNames []string,
remote map[string]alpm.IPackage, aurData map[string]*aur.Pkg,
) {
for _, name := range remoteNames {
if _, ok := aurData[name]; !ok && !remote[name].ShouldIgnore() {
if _, ok := aurData[strings.TrimSuffix(name, "-debug")]; !ok {
warnings.Missing = append(warnings.Missing, name)
}
}
}
}
func (warnings *AURWarnings) Print() {
normalMissing, debugMissing := filterDebugPkgs(warnings.Missing)
if len(normalMissing) > 0 {
text.Warn(gotext.Get("Missing AUR Packages:"))
printRange(normalMissing)
warnings.log.Warnln(gotext.Get("Packages not in AUR:"), formatNames(normalMissing))
}
if len(debugMissing) > 0 {
text.Warn(gotext.Get("Missing AUR Debug Packages:"))
printRange(debugMissing)
warnings.log.Warnln(gotext.Get("Missing AUR Debug Packages:"), formatNames(debugMissing))
}
if len(warnings.Orphans) > 0 {
text.Warn(gotext.Get("Orphaned AUR Packages:"))
printRange(warnings.Orphans)
warnings.log.Warnln(gotext.Get("Orphan (unmaintained) AUR Packages:"), formatNames(warnings.Orphans))
}
if len(warnings.OutOfDate) > 0 {
text.Warn(gotext.Get("Flagged Out Of Date AUR Packages:"))
printRange(warnings.OutOfDate)
warnings.log.Warnln(gotext.Get("Flagged Out Of Date AUR Packages:"), formatNames(warnings.OutOfDate))
}
if len(warnings.LocalNewer) > 0 {
for _, newer := range warnings.LocalNewer {
warnings.log.Warnln(newer)
}
}
}
@ -60,10 +105,6 @@ func filterDebugPkgs(names []string) (normal, debug []string) {
return
}
func printRange(names []string) {
for _, name := range names {
fmt.Print(" " + text.Cyan(name))
}
fmt.Println()
func formatNames(names []string) string {
return " " + text.Cyan(strings.Join(names, " "))
}

View File

@ -3,53 +3,23 @@ package query
import (
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
// GetPackageNamesBySource returns package names with and without correspondence in SyncDBS respectively.
func GetPackageNamesBySource(dbExecutor db.Executor) (local, remote []string, err error) {
for _, localpkg := range dbExecutor.LocalPackages() {
pkgName := localpkg.Name()
if dbExecutor.SyncPackage(pkgName) != nil {
local = append(local, pkgName)
} else {
remote = append(remote, pkgName)
}
}
return local, remote, err
}
// GetRemotePackages returns packages with no correspondence in SyncDBS.
func GetRemotePackages(dbExecutor db.Executor) (
remote []db.IPackage,
remoteNames []string) {
for _, localpkg := range dbExecutor.LocalPackages() {
pkgName := localpkg.Name()
if dbExecutor.SyncPackage(pkgName) == nil {
remote = append(remote, localpkg)
remoteNames = append(remoteNames, pkgName)
}
}
return remote, remoteNames
}
func RemoveInvalidTargets(targets []string, mode parser.TargetMode) []string {
func RemoveInvalidTargets(logger *text.Logger, targets []string, mode parser.TargetMode) []string {
filteredTargets := make([]string, 0)
for _, target := range targets {
dbName, _ := text.SplitDBFromName(target)
if dbName == "aur" && !mode.AtLeastAUR() {
text.Warnln(gotext.Get("%s: can't use target with option --repo -- skipping", text.Cyan(target)))
logger.Warnln(gotext.Get("%s: can't use target with option --repo -- skipping", text.Cyan(target)))
continue
}
if dbName != "aur" && dbName != "" && !mode.AtLeastRepo() {
text.Warnln(gotext.Get("%s: can't use target with option --aur -- skipping", text.Cyan(target)))
logger.Warnln(gotext.Get("%s: can't use target with option --aur -- skipping", text.Cyan(target)))
continue
}

89
pkg/query/metric.go Normal file
View File

@ -0,0 +1,89 @@
package query
import (
"hash/fnv"
"strings"
"github.com/adrg/strutil"
)
const minVotes = 30
// TODO: Add support for Popularity and LastModified
func (a *abstractResults) aurSortByMetric(pkg *abstractResult) float64 {
return 1 - (minVotes / (minVotes + float64(pkg.votes)))
}
func (a *abstractResults) GetMetric(pkg *abstractResult) float64 {
if v, ok := a.distanceCache[pkg.name]; ok {
return v
}
if strings.EqualFold(pkg.name, a.search) {
return 1.0
}
sim := strutil.Similarity(pkg.name, a.search, a.metric)
for _, prov := range pkg.provides {
// If the package provides search, it's a perfect match
// AUR packages don't populate provides
candidate := strutil.Similarity(prov, a.search, a.metric) * 0.80
if candidate > sim {
sim = candidate
}
}
simDesc := strutil.Similarity(pkg.description, a.search, a.metric)
// slightly overweight sync sources by always giving them max popularity
popularity := 1.0
if pkg.source == sourceAUR {
popularity = a.aurSortByMetric(pkg)
}
sim = sim*0.5 + simDesc*0.2 + popularity*0.3
a.distanceCache[pkg.name] = sim
return sim
}
func (a *abstractResults) separateSourceScore(source string, score float64) float64 {
if !a.separateSources {
return 0
}
if score == 1.0 {
return 50
}
switch source {
case sourceAUR:
return 0
case "core":
return 40
case "extra":
return 30
case "community":
return 20
case "multilib":
return 10
}
if v, ok := a.separateSourceCache[source]; ok {
return v
}
h := fnv.New32a()
h.Write([]byte(source))
sourceScore := float64(int(h.Sum32())%9 + 2)
a.separateSourceCache[source] = sourceScore
return sourceScore
}
func (a *abstractResults) calculateMetric(pkg *abstractResult) float64 {
score := a.GetMetric(pkg)
return a.separateSourceScore(pkg.source, score) + score
}
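A worked example of the weighting above, with made-up similarity values: minVotes is 30, so an AUR package with 90 votes gets popularity 1 - 30/(30+90) = 0.75, while sync-repo packages are pinned at 1.0 before the per-source offset is added:

// Illustrative only: the same arithmetic as aurSortByMetric / GetMetric,
// with hypothetical strutil similarity scores against the search term.
votes := 90.0
popularity := 1 - (30.0 / (30.0 + votes)) // 0.75 for an AUR package with 90 votes
nameSim, descSim := 0.9, 0.4
score := nameSim*0.5 + descSim*0.2 + popularity*0.3
_ = score // 0.755 with these inputs; separateSourceScore is then added on top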

View File

@ -1,61 +0,0 @@
package query
import (
"context"
"strings"
"testing"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/aur"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func TestMixedSourceQueryBuilder(t *testing.T) {
t.Parallel()
type testCase struct {
desc string
bottomUp bool
want string
}
testCases := []testCase{
{desc: "bottomup", bottomUp: true, want: "\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n"},
{
desc: "topdown", bottomUp: false,
want: "\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
}
for _, tc := range testCases {
t.Run(tc.desc, func(t *testing.T) {
queryBuilder := NewMixedSourceQueryBuilder("votes", parser.ModeAny, "", tc.bottomUp, false)
search := []string{"linux"}
mockStore := &mockDB{}
client, err := aur.NewClient(aur.WithHTTPClient(&mockDoer{}))
require.NoError(t, err)
queryBuilder.Execute(context.Background(), mockStore, client, search)
assert.Len(t, queryBuilder.results, 3)
assert.Equal(t, 3, queryBuilder.Len())
if tc.bottomUp {
assert.Equal(t, "linux-ck", queryBuilder.results[0].name)
assert.Equal(t, "linux-zen", queryBuilder.results[1].name)
assert.Equal(t, "linux", queryBuilder.results[2].name)
} else {
assert.Equal(t, "linux-ck", queryBuilder.results[2].name)
assert.Equal(t, "linux-zen", queryBuilder.results[1].name)
assert.Equal(t, "linux", queryBuilder.results[0].name)
}
w := &strings.Builder{}
queryBuilder.Results(w, mockStore, Detailed)
wString := w.String()
require.GreaterOrEqual(t, len(wString), 1)
assert.Equal(t, tc.want, wString)
})
}
}

View File

@ -2,35 +2,43 @@ package query
import (
"context"
"fmt"
"io"
"sort"
"strconv"
"strings"
"unicode"
"github.com/Jguer/aur"
"github.com/Jguer/go-alpm/v2"
"github.com/adrg/strutil"
"github.com/adrg/strutil/metrics"
mapset "github.com/deckarep/golang-set/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/intrange"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/intrange"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
)
const sourceAUR = "aur"
type SearchVerbosity int
// Verbosity settings for search.
const (
NumberMenu SearchVerbosity = iota
Detailed
Minimal
)
type Builder interface {
Len() int
Execute(ctx context.Context, dbExecutor db.Executor, aurClient aur.ClientInterface, pkgS []string)
Results(w io.Writer, dbExecutor db.Executor, verboseSearch SearchVerbosity) error
GetTargets(include, exclude intrange.IntRanges, otherExclude stringset.StringSet) ([]string, error)
Execute(ctx context.Context, dbExecutor db.Executor, pkgS []string)
Results(dbExecutor db.Executor, verboseSearch SearchVerbosity) error
GetTargets(include, exclude intrange.IntRanges, otherExclude mapset.Set[string]) ([]string, error)
}
type MixedSourceQueryBuilder struct {
type SourceQueryBuilder struct {
results []abstractResult
sortBy string
searchBy string
@ -38,21 +46,31 @@ type MixedSourceQueryBuilder struct {
queryMap map[string]map[string]interface{}
bottomUp bool
singleLineResults bool
separateSources bool
aurClient aur.QueryClient
logger *text.Logger
}
func NewMixedSourceQueryBuilder(
func NewSourceQueryBuilder(
aurClient aur.QueryClient,
logger *text.Logger,
sortBy string,
targetMode parser.TargetMode,
searchBy string,
bottomUp,
singleLineResults bool,
) *MixedSourceQueryBuilder {
return &MixedSourceQueryBuilder{
separateSources bool,
) *SourceQueryBuilder {
return &SourceQueryBuilder{
aurClient: aurClient,
logger: logger,
bottomUp: bottomUp,
sortBy: sortBy,
targetMode: targetMode,
searchBy: searchBy,
singleLineResults: singleLineResults,
separateSources: separateSources,
queryMap: map[string]map[string]interface{}{},
results: make([]abstractResult, 0, 100),
}
@ -67,81 +85,71 @@ type abstractResult struct {
}
type abstractResults struct {
results []abstractResult
search string
distanceCache map[string]float64
bottomUp bool
metric strutil.StringMetric
results []abstractResult
search string
bottomUp bool
metric strutil.StringMetric
separateSources bool
sortBy string
distanceCache map[string]float64
separateSourceCache map[string]float64
}
func (a *abstractResults) Len() int { return len(a.results) }
func (a *abstractResults) Swap(i, j int) { a.results[i], a.results[j] = a.results[j], a.results[i] }
func (a *abstractResults) GetMetric(pkg *abstractResult) float64 {
if v, ok := a.distanceCache[pkg.name]; ok {
return v
}
sim := strutil.Similarity(pkg.name, a.search, a.metric)
for _, prov := range pkg.provides {
// If the package provides search, it's a perfect match
// AUR packages don't populate provides
candidate := strutil.Similarity(prov, a.search, a.metric)
if candidate > sim {
sim = candidate
}
}
simDesc := strutil.Similarity(pkg.description, a.search, a.metric)
// slightly overweight sync sources by always giving them max popularity
popularity := 1.0
if pkg.source == sourceAUR {
popularity = float64(pkg.votes) / float64(pkg.votes+60)
}
sim = sim*0.6 + simDesc*0.2 + popularity*0.2
a.distanceCache[pkg.name] = sim
return sim
}
func (a *abstractResults) Less(i, j int) bool {
pkgA := a.results[i]
pkgB := a.results[j]
simA := a.GetMetric(&pkgA)
simB := a.GetMetric(&pkgB)
var cmpResult bool
if a.bottomUp {
return simA < simB
switch a.sortBy {
case "name":
cmpResult = !text.LessRunes([]rune(pkgA.name), []rune(pkgB.name))
if a.separateSources {
cmpSources := strings.Compare(pkgA.source, pkgB.source)
if cmpSources != 0 {
cmpResult = cmpSources > 0
}
}
default:
simA := a.calculateMetric(&pkgA)
simB := a.calculateMetric(&pkgB)
cmpResult = simA > simB
}
return simA > simB
if a.bottomUp {
cmpResult = !cmpResult
}
return cmpResult
}
func (s *MixedSourceQueryBuilder) Execute(ctx context.Context, dbExecutor db.Executor, aurClient aur.ClientInterface, pkgS []string) {
func (s *SourceQueryBuilder) Execute(ctx context.Context, dbExecutor db.Executor, pkgS []string) {
var aurErr error
pkgS = RemoveInvalidTargets(pkgS, s.targetMode)
pkgS = RemoveInvalidTargets(s.logger, pkgS, s.targetMode)
metric := &metrics.JaroWinkler{
metric := &metrics.Hamming{
CaseSensitive: false,
}
sortableResults := &abstractResults{
results: []abstractResult{},
search: strings.Join(pkgS, ""),
distanceCache: map[string]float64{},
bottomUp: s.bottomUp,
metric: metric,
results: []abstractResult{},
search: strings.Join(pkgS, ""),
bottomUp: s.bottomUp,
metric: metric,
separateSources: s.separateSources,
sortBy: s.sortBy,
distanceCache: map[string]float64{},
separateSourceCache: map[string]float64{},
}
if s.targetMode.AtLeastAUR() {
var aurResults aurQuery
aurResults, aurErr = queryAUR(ctx, aurClient, pkgS, s.searchBy)
var aurResults []aur.Pkg
aurResults, aurErr = queryAUR(ctx, s.aurClient, pkgS, s.searchBy)
dbName := sourceAUR
for i := range aurResults {
@ -149,6 +157,12 @@ func (s *MixedSourceQueryBuilder) Execute(ctx context.Context, dbExecutor db.Exe
s.queryMap[dbName] = map[string]interface{}{}
}
by := getSearchBy(s.searchBy)
if (by == aur.NameDesc || by == aur.None || by == aur.Name) &&
!matchesSearch(&aurResults[i], pkgS) {
continue
}
s.queryMap[dbName][aurResults[i].Name] = aurResults[i]
sortableResults.results = append(sortableResults.results, abstractResult{
@ -194,18 +208,18 @@ func (s *MixedSourceQueryBuilder) Execute(ctx context.Context, dbExecutor db.Exe
s.results = sortableResults.results
if aurErr != nil {
text.Errorln(ErrAURSearch{inner: aurErr})
s.logger.Errorln(ErrAURSearch{inner: aurErr})
if len(repoResults) != 0 {
text.Warnln(gotext.Get("Showing repo packages only"))
s.logger.Warnln(gotext.Get("Showing repo packages only"))
}
}
}
func (s *MixedSourceQueryBuilder) Results(w io.Writer, dbExecutor db.Executor, verboseSearch SearchVerbosity) error {
func (s *SourceQueryBuilder) Results(dbExecutor db.Executor, verboseSearch SearchVerbosity) error {
for i := range s.results {
if verboseSearch == Minimal {
_, _ = fmt.Fprintln(w, s.results[i].name)
s.logger.Println(s.results[i].name)
continue
}
@ -220,34 +234,34 @@ func (s *MixedSourceQueryBuilder) Results(w io.Writer, dbExecutor db.Executor, v
}
pkg := s.queryMap[s.results[i].source][s.results[i].name]
if s.results[i].source == sourceAUR {
aurPkg := pkg.(aur.Pkg)
toPrint += aurPkgSearchString(&aurPkg, dbExecutor, s.singleLineResults)
} else {
syncPkg := pkg.(alpm.IPackage)
toPrint += syncPkgSearchString(syncPkg, dbExecutor, s.singleLineResults)
switch pPkg := pkg.(type) {
case aur.Pkg:
toPrint += aurPkgSearchString(&pPkg, dbExecutor, s.singleLineResults)
case alpm.IPackage:
toPrint += syncPkgSearchString(pPkg, dbExecutor, s.singleLineResults)
}
fmt.Fprintln(w, toPrint)
s.logger.Println(toPrint)
}
return nil
}
func (s *MixedSourceQueryBuilder) Len() int {
func (s *SourceQueryBuilder) Len() int {
return len(s.results)
}
func (s *MixedSourceQueryBuilder) GetTargets(include, exclude intrange.IntRanges,
otherExclude stringset.StringSet,
func (s *SourceQueryBuilder) GetTargets(include, exclude intrange.IntRanges,
otherExclude mapset.Set[string],
) ([]string, error) {
var (
isInclude = len(exclude) == 0 && len(otherExclude) == 0
isInclude = len(exclude) == 0 && otherExclude.Cardinality() == 0
targets []string
lenRes = len(s.results)
)
for i := 0; i <= s.Len(); i++ {
for i := 1; i <= s.Len(); i++ {
target := i - 1
if s.bottomUp {
target = lenRes - i
@ -260,3 +274,25 @@ func (s *MixedSourceQueryBuilder) GetTargets(include, exclude intrange.IntRanges
return targets, nil
}
func matchesSearch(pkg *aur.Pkg, terms []string) bool {
if len(terms) <= 1 {
return true
}
for _, pkgN := range terms {
if strings.IndexFunc(pkgN, unicode.IsSymbol) != -1 {
return true
}
name := strings.ToLower(pkg.Name)
desc := strings.ToLower(pkg.Description)
targ := strings.ToLower(pkgN)
if !strings.Contains(name, targ) && !strings.Contains(desc, targ) {
return false
}
}
return true
}
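
The new matchesSearch filter only applies when more than one term is given: every term must then appear in the package name or description, and any term containing a symbol rune switches the filtering off entirely. A hedged standalone sketch of that rule follows; the helper name and sample data are mine, not from the diff.

package main

import (
	"fmt"
	"strings"
	"unicode"
)

// matchesAllTerms mirrors the rule in matchesSearch, for illustration only.
func matchesAllTerms(name, desc string, terms []string) bool {
	if len(terms) <= 1 {
		return true
	}
	for _, term := range terms {
		// A term with a symbol rune disables the extra filtering, as in the diff.
		if strings.IndexFunc(term, unicode.IsSymbol) != -1 {
			return true
		}
		t := strings.ToLower(term)
		if !strings.Contains(strings.ToLower(name), t) &&
			!strings.Contains(strings.ToLower(desc), t) {
			return false
		}
	}
	return true
}

func main() {
	desc := "The Linux-ck kernel and modules with ck's hrtimer patches"
	fmt.Println(matchesAllTerms("linux-ck", desc, []string{"linux-ck", "hrtimer"})) // true
	fmt.Println(matchesAllTerms("linux-zen", "The Linux ZEN kernel and modules",
		[]string{"linux-ck", "hrtimer"})) // false
}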

View File

@ -0,0 +1,362 @@
//go:build !integration
// +build !integration
package query
import (
"context"
"io"
"strings"
"testing"
"github.com/Jguer/aur"
"github.com/Jguer/yay/v12/pkg/db/mock"
mockaur "github.com/Jguer/yay/v12/pkg/dep/mock"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/stretchr/testify/assert"
)
func TestSourceQueryBuilder(t *testing.T) {
t.Parallel()
type testCase struct {
desc string
search []string
bottomUp bool
separateSources bool
sortBy string
verbosity SearchVerbosity
targetMode parser.TargetMode
singleLineResults bool
searchBy string
wantResults []string
wantOutput []string
}
testCases := []testCase{
{
desc: "sort-by-votes bottomup separatesources",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "votes",
verbosity: Detailed,
wantResults: []string{"linux-ck", "linux-zen", "linux"},
wantOutput: []string{
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
},
},
{
desc: "sort-by-votes topdown separatesources",
search: []string{"linux"},
bottomUp: false,
separateSources: true,
sortBy: "votes",
verbosity: Detailed,
wantResults: []string{"linux", "linux-zen", "linux-ck"},
wantOutput: []string{
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
},
{
desc: "sort-by-votes bottomup noseparatesources",
search: []string{"linux"},
bottomUp: true,
separateSources: false,
sortBy: "votes",
verbosity: Detailed,
wantResults: []string{"linux-zen", "linux-ck", "linux"},
wantOutput: []string{
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
},
},
{
desc: "sort-by-votes topdown noseparatesources",
search: []string{"linux"},
bottomUp: false,
separateSources: false,
sortBy: "votes",
verbosity: Detailed,
wantResults: []string{"linux", "linux-ck", "linux-zen"},
wantOutput: []string{
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
},
},
{
desc: "sort-by-name bottomup separatesources",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "name",
verbosity: Detailed,
wantResults: []string{"linux-ck", "linux", "linux-zen"},
wantOutput: []string{
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
},
},
{
desc: "sort-by-name topdown separatesources",
search: []string{"linux"},
bottomUp: false,
separateSources: true,
sortBy: "name",
verbosity: Detailed,
wantResults: []string{"linux-zen", "linux", "linux-ck"},
wantOutput: []string{
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
},
{
desc: "sort-by-name bottomup noseparatesources",
search: []string{"linux"},
bottomUp: true,
separateSources: false,
sortBy: "name",
verbosity: Detailed,
wantResults: []string{"linux", "linux-ck", "linux-zen"},
wantOutput: []string{
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
},
},
{
desc: "sort-by-name topdown noseparatesources",
search: []string{"linux"},
bottomUp: false,
separateSources: false,
sortBy: "name",
verbosity: Detailed,
wantResults: []string{"linux-zen", "linux-ck", "linux"},
wantOutput: []string{
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
},
},
{
desc: "sort-by-votes bottomup separatesources number-menu",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "votes",
verbosity: NumberMenu,
wantResults: []string{"linux-ck", "linux-zen", "linux"},
wantOutput: []string{
"\x1b[35m3\x1b[0m \x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[35m2\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[35m1\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
},
},
{
desc: "sort-by-votes topdown separatesources number-menu",
search: []string{"linux"},
bottomUp: false,
separateSources: true,
sortBy: "votes",
verbosity: NumberMenu,
wantResults: []string{"linux", "linux-zen", "linux-ck"},
wantOutput: []string{
"\x1b[35m1\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[35m2\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[35m3\x1b[0m \x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
},
{
desc: "sort-by-name bottomup separatesources number-menu",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "name",
verbosity: NumberMenu,
wantResults: []string{"linux-ck", "linux", "linux-zen"},
wantOutput: []string{
"\x1b[35m3\x1b[0m \x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[35m2\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[35m1\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
},
},
{
desc: "sort-by-name topdown separatesources number-menu",
search: []string{"linux"},
bottomUp: false,
separateSources: true,
sortBy: "name",
verbosity: NumberMenu,
wantResults: []string{"linux-zen", "linux", "linux-ck"},
wantOutput: []string{
"\x1b[35m1\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n",
"\x1b[35m2\x1b[0m \x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n",
"\x1b[35m3\x1b[0m \x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
},
{
desc: "sort-by-name bottomup noseparatesources minimal",
search: []string{"linux"},
bottomUp: true,
separateSources: false,
sortBy: "name",
verbosity: Minimal,
wantResults: []string{"linux", "linux-ck", "linux-zen"},
wantOutput: []string{
"linux\n",
"linux-ck\n",
"linux-zen\n",
},
},
{
desc: "only-aur minimal",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "name",
verbosity: Minimal,
targetMode: parser.ModeAUR,
wantResults: []string{"linux-ck"},
wantOutput: []string{
"linux-ck\n",
},
},
{
desc: "only-repo minimal",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "name",
verbosity: Minimal,
targetMode: parser.ModeRepo,
wantResults: []string{"linux", "linux-zen"},
wantOutput: []string{
"linux\n",
"linux-zen\n",
},
},
{
desc: "sort-by-name singleline",
search: []string{"linux"},
bottomUp: true,
separateSources: true,
sortBy: "name",
verbosity: Detailed,
singleLineResults: true,
wantResults: []string{"linux-ck", "linux", "linux-zen"},
wantOutput: []string{
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\tThe Linux-ck kernel and modules with ck's hrtimer patches\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\tThe Linux kernel and modules\n",
"\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\tThe Linux ZEN kernel and modules\n",
},
},
{
desc: "sort-by-name search-by-name",
search: []string{"linux-ck"},
bottomUp: true,
separateSources: true,
sortBy: "name",
verbosity: Detailed,
searchBy: "name",
targetMode: parser.ModeAUR,
wantResults: []string{"linux-ck"},
wantOutput: []string{
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
},
{
desc: "only-aur search-by-several-terms",
search: []string{"linux-ck", "hrtimer"},
bottomUp: true,
separateSources: true,
verbosity: Detailed,
targetMode: parser.ModeAUR,
wantResults: []string{"linux-ck"},
wantOutput: []string{
"\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
},
}
mockDB := &mock.DBExecutor{
SyncPackagesFn: func(pkgs ...string) []mock.IPackage {
mockDB := mock.NewDB("core")
return []mock.IPackage{
&mock.Package{
PName: "linux",
PVersion: "5.16.0",
PDescription: "The Linux kernel and modules",
PSize: 1,
PISize: 1,
PDB: mockDB,
},
&mock.Package{
PName: "linux-zen",
PVersion: "5.16.0",
PDescription: "The Linux ZEN kernel and modules",
PSize: 1,
PISize: 1,
PDB: mockDB,
},
}
},
LocalPackageFn: func(string) mock.IPackage {
return nil
},
}
mockAUR := &mockaur.MockAUR{
GetFn: func(ctx context.Context, query *aur.Query) ([]aur.Pkg, error) {
return []aur.Pkg{
{
Description: "The Linux-ck kernel and modules with ck's hrtimer patches",
FirstSubmitted: 1311346274,
ID: 1045311,
LastModified: 1646250901,
Maintainer: "graysky",
Name: "linux-ck",
NumVotes: 450,
OutOfDate: 0,
PackageBase: "linux-ck",
PackageBaseID: 50911,
Popularity: 1.511141,
URL: "https://wiki.archlinux.org/index.php/Linux-ck",
URLPath: "/cgit/aur.git/snapshot/linux-ck.tar.gz",
Version: "5.16.12-1",
},
}, nil
},
}
for _, tc := range testCases {
t.Run(tc.desc, func(t *testing.T) {
w := &strings.Builder{}
queryBuilder := NewSourceQueryBuilder(mockAUR,
text.NewLogger(w, io.Discard, strings.NewReader(""), false, "test"),
tc.sortBy, tc.targetMode, tc.searchBy, tc.bottomUp,
tc.singleLineResults, tc.separateSources)
queryBuilder.Execute(context.Background(), mockDB, tc.search)
assert.Len(t, queryBuilder.results, len(tc.wantResults))
assert.Equal(t, len(tc.wantResults), queryBuilder.Len())
for i, name := range tc.wantResults {
assert.Equal(t, name, queryBuilder.results[i].name)
}
queryBuilder.Results(mockDB, tc.verbosity)
assert.Equal(t, strings.Join(tc.wantOutput, ""), w.String())
})
}
}

View File

@ -2,193 +2,32 @@ package query
import (
"context"
"io"
"sort"
"strings"
"github.com/Jguer/aur"
"github.com/Jguer/go-alpm/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/intrange"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/yay/v11/pkg/stringset"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/hashicorp/go-multierror"
)
type SearchVerbosity int
// Verbosity settings for search.
const (
NumberMenu SearchVerbosity = iota
Detailed
Minimal
)
type SourceQueryBuilder struct {
repoQuery
aurQuery
sortBy string
searchBy string
targetMode parser.TargetMode
bottomUp bool
singleLineResults bool
}
func NewSourceQueryBuilder(
sortBy string,
targetMode parser.TargetMode,
searchBy string,
bottomUp,
singleLineResults bool,
) *SourceQueryBuilder {
return &SourceQueryBuilder{
repoQuery: []alpm.IPackage{},
aurQuery: []aur.Pkg{},
bottomUp: bottomUp,
sortBy: sortBy,
targetMode: targetMode,
searchBy: searchBy,
singleLineResults: singleLineResults,
}
}
func (s *SourceQueryBuilder) Execute(ctx context.Context, dbExecutor db.Executor, aurClient aur.ClientInterface, pkgS []string) {
var aurErr error
pkgS = RemoveInvalidTargets(pkgS, s.targetMode)
if s.targetMode.AtLeastAUR() {
s.aurQuery, aurErr = queryAUR(ctx, aurClient, pkgS, s.searchBy)
s.aurQuery = filterAURResults(pkgS, s.aurQuery)
sort.Sort(aurSortable{aurQuery: s.aurQuery, sortBy: s.sortBy, bottomUp: s.bottomUp})
}
if s.targetMode.AtLeastRepo() {
s.repoQuery = repoQuery(dbExecutor.SyncPackages(pkgS...))
if s.bottomUp {
s.Reverse()
}
}
if aurErr != nil && len(s.repoQuery) != 0 {
text.Errorln(ErrAURSearch{inner: aurErr})
text.Warnln(gotext.Get("Showing repo packages only"))
}
}
func (s *SourceQueryBuilder) Results(w io.Writer, dbExecutor db.Executor, verboseSearch SearchVerbosity) error {
if s.aurQuery == nil || s.repoQuery == nil {
return ErrNoQuery{}
}
if s.bottomUp {
if s.targetMode.AtLeastAUR() {
s.aurQuery.printSearch(w, len(s.repoQuery)+1, dbExecutor, verboseSearch, s.bottomUp, s.singleLineResults)
}
if s.targetMode.AtLeastRepo() {
s.repoQuery.printSearch(w, dbExecutor, verboseSearch, s.bottomUp, s.singleLineResults)
}
} else {
if s.targetMode.AtLeastRepo() {
s.repoQuery.printSearch(w, dbExecutor, verboseSearch, s.bottomUp, s.singleLineResults)
}
if s.targetMode.AtLeastAUR() {
s.aurQuery.printSearch(w, len(s.repoQuery)+1, dbExecutor, verboseSearch, s.bottomUp, s.singleLineResults)
}
}
return nil
}
func (s *SourceQueryBuilder) Len() int {
return len(s.repoQuery) + len(s.aurQuery)
}
func (s *SourceQueryBuilder) GetTargets(include, exclude intrange.IntRanges,
otherExclude stringset.StringSet,
) ([]string, error) {
isInclude := len(exclude) == 0 && len(otherExclude) == 0
var targets []string
for i, pkg := range s.repoQuery {
var target int
if s.bottomUp {
target = len(s.repoQuery) - i
} else {
target = i + 1
}
if (isInclude && include.Get(target)) || (!isInclude && !exclude.Get(target)) {
targets = append(targets, pkg.DB().Name()+"/"+pkg.Name())
}
}
for i := range s.aurQuery {
var target int
if s.bottomUp {
target = len(s.aurQuery) - i + len(s.repoQuery)
} else {
target = i + 1 + len(s.repoQuery)
}
if (isInclude && include.Get(target)) || (!isInclude && !exclude.Get(target)) {
targets = append(targets, "aur/"+s.aurQuery[i].Name)
}
}
return targets, nil
}
// filter AUR results to remove strings that don't contain all of the search terms.
func filterAURResults(pkgS []string, results []aur.Pkg) []aur.Pkg {
aurPkgs := make([]aur.Pkg, 0, len(results))
matchesSearchTerms := func(pkg *aur.Pkg, terms []string) bool {
for _, pkgN := range terms {
name := strings.ToLower(pkg.Name)
desc := strings.ToLower(pkg.Description)
targ := strings.ToLower(pkgN)
if !(strings.Contains(name, targ) || strings.Contains(desc, targ)) {
return false
}
}
return true
}
for i := range results {
if matchesSearchTerms(&results[i], pkgS) {
aurPkgs = append(aurPkgs, results[i])
}
}
return aurPkgs
}
// queryAUR searches AUR and narrows based on subarguments.
func queryAUR(ctx context.Context, aurClient aur.ClientInterface, pkgS []string, searchBy string) ([]aur.Pkg, error) {
func queryAUR(ctx context.Context,
aurClient aur.QueryClient,
pkgS []string, searchBy string,
) ([]aur.Pkg, error) {
var (
err error
by = getSearchBy(searchBy)
)
for _, word := range pkgS {
var r []aur.Pkg
r, err = aurClient.Search(ctx, word, by)
if err == nil {
r, errM := aurClient.Get(ctx, &aur.Query{
Needles: []string{word},
By: by,
Contains: true,
})
if errM == nil {
return r, nil
}
err = multierror.Append(err, errM)
}
return nil, err
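
In the rewritten queryAUR, each search word is sent as its own aur.Query and the first successful reply is returned as-is; failures accumulate into a multierror. A minimal sketch of a caller inside the query package, assuming an aur.QueryClient such as the rpc or metadata client; the helper name and search words are mine.

package query

import (
	"context"
	"fmt"

	"github.com/Jguer/aur"
)

// exampleQueryAUR is illustrative only; Execute drives queryAUR the same way.
func exampleQueryAUR(ctx context.Context, client aur.QueryClient) {
	// An empty searchBy falls back to aur.NameDesc via getSearchBy.
	pkgs, err := queryAUR(ctx, client, []string{"linux", "linux-zen"}, "")
	if err != nil {
		// err aggregates one entry per word the client failed to answer.
		fmt.Println("AUR search failed:", err)
		return
	}
	for i := range pkgs {
		fmt.Println(pkgs[i].Name, pkgs[i].NumVotes)
	}
}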

View File

@ -1,136 +0,0 @@
package query
import (
"bytes"
"context"
"io"
"net/http"
"strings"
"testing"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/db/mock"
"github.com/Jguer/yay/v11/pkg/settings/parser"
"github.com/Jguer/aur"
"github.com/Jguer/go-alpm/v2"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
const validPayload = `{
"resultcount": 1,
"results": [
{
"Description": "The Linux-ck kernel and modules with ck's hrtimer patches",
"FirstSubmitted": 1311346274,
"ID": 1045311,
"LastModified": 1646250901,
"Maintainer": "graysky",
"Name": "linux-ck",
"NumVotes": 450,
"OutOfDate": null,
"PackageBase": "linux-ck",
"PackageBaseID": 50911,
"Popularity": 1.511141,
"URL": "https://wiki.archlinux.org/index.php/Linux-ck",
"URLPath": "/cgit/aur.git/snapshot/linux-ck.tar.gz",
"Version": "5.16.12-1"
}
],
"type": "search",
"version": 5
}
`
type mockDB struct {
db.Executor
}
func (m *mockDB) LocalPackage(string) alpm.IPackage {
return nil
}
func (m *mockDB) PackageGroups(pkg alpm.IPackage) []string {
return []string{}
}
func (m *mockDB) SyncPackages(...string) []alpm.IPackage {
mockDB := mock.NewDB("core")
linuxRepo := &mock.Package{
PName: "linux",
PVersion: "5.16.0",
PDescription: "The Linux kernel and modules",
PSize: 1,
PISize: 1,
PDB: mockDB,
}
linuxZen := &mock.Package{
PName: "linux-zen",
PVersion: "5.16.0",
PDescription: "The Linux ZEN kernel and modules",
PSize: 1,
PISize: 1,
PDB: mockDB,
}
return []alpm.IPackage{linuxRepo, linuxZen}
}
type mockDoer struct{}
func (m *mockDoer) Do(req *http.Request) (*http.Response, error) {
return &http.Response{
StatusCode: http.StatusOK,
Body: io.NopCloser(bytes.NewBufferString(validPayload)),
}, nil
}
func TestSourceQueryBuilder(t *testing.T) {
t.Parallel()
type testCase struct {
desc string
bottomUp bool
want string
}
testCases := []testCase{
{desc: "bottomup", bottomUp: true, want: "\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n"},
{
desc: "topdown", bottomUp: false,
want: "\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux kernel and modules\n\x1b[1m\x1b[33mcore\x1b[0m\x1b[0m/\x1b[1mlinux-zen\x1b[0m \x1b[36m5.16.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n The Linux ZEN kernel and modules\n\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mlinux-ck\x1b[0m \x1b[36m5.16.12-1\x1b[0m\x1b[1m (+450\x1b[0m \x1b[1m1.51) \x1b[0m\n The Linux-ck kernel and modules with ck's hrtimer patches\n",
},
}
for _, tc := range testCases {
t.Run(tc.desc, func(t *testing.T) {
queryBuilder := NewSourceQueryBuilder("votes", parser.ModeAny, "", tc.bottomUp, false)
search := []string{"linux"}
mockStore := &mockDB{}
client, err := aur.NewClient(aur.WithHTTPClient(&mockDoer{}))
require.NoError(t, err)
queryBuilder.Execute(context.Background(), mockStore, client, search)
assert.Len(t, queryBuilder.aurQuery, 1)
assert.Len(t, queryBuilder.repoQuery, 2)
assert.Equal(t, 3, queryBuilder.Len())
assert.Equal(t, "linux-ck", queryBuilder.aurQuery[0].Name)
if tc.bottomUp {
assert.Equal(t, "linux-zen", queryBuilder.repoQuery[0].Name())
assert.Equal(t, "linux", queryBuilder.repoQuery[1].Name())
} else {
assert.Equal(t, "linux-zen", queryBuilder.repoQuery[1].Name())
assert.Equal(t, "linux", queryBuilder.repoQuery[0].Name())
}
w := &strings.Builder{}
queryBuilder.Results(w, mockStore, Detailed)
wString := w.String()
require.GreaterOrEqual(t, len(wString), 1)
assert.Equal(t, tc.want, wString)
})
}
}

View File

@ -2,74 +2,17 @@ package query
import (
"fmt"
"io"
"strconv"
"github.com/Jguer/aur"
"github.com/Jguer/go-alpm/v2"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v11/pkg/db"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/yay/v12/pkg/db"
"github.com/Jguer/yay/v12/pkg/text"
)
type (
aurQuery []aur.Pkg // Query is a collection of Results.
repoQuery []alpm.IPackage // Query holds the results of a repository search.
)
type aurSortable struct {
aurQuery
sortBy string
bottomUp bool
}
func (r repoQuery) Reverse() {
for i, j := 0, len(r)-1; i < j; i, j = i+1, j-1 {
r[i], r[j] = r[j], r[i]
}
}
func (r repoQuery) Less(i, j int) bool {
return text.LessRunes([]rune(r[i].Name()), []rune(r[j].Name()))
}
func (q aurSortable) Len() int {
return len(q.aurQuery)
}
func (q aurSortable) Less(i, j int) bool {
var result bool
switch q.sortBy {
case "votes":
result = q.aurQuery[i].NumVotes > q.aurQuery[j].NumVotes
case "popularity":
result = q.aurQuery[i].Popularity > q.aurQuery[j].Popularity
case "name":
result = text.LessRunes([]rune(q.aurQuery[i].Name), []rune(q.aurQuery[j].Name))
case "base":
result = text.LessRunes([]rune(q.aurQuery[i].PackageBase), []rune(q.aurQuery[j].PackageBase))
case "submitted":
result = q.aurQuery[i].FirstSubmitted < q.aurQuery[j].FirstSubmitted
case "modified":
result = q.aurQuery[i].LastModified < q.aurQuery[j].LastModified
case "id":
result = q.aurQuery[i].ID < q.aurQuery[j].ID
case "baseid":
result = q.aurQuery[i].PackageBaseID < q.aurQuery[j].PackageBaseID
}
if q.bottomUp {
return !result
}
return result
}
func (q aurSortable) Swap(i, j int) {
q.aurQuery[i], q.aurQuery[j] = q.aurQuery[j], q.aurQuery[i]
}
type Pkg = aur.Pkg
func getSearchBy(value string) aur.By {
switch value {
@ -77,6 +20,8 @@ func getSearchBy(value string) aur.By {
return aur.Name
case "maintainer":
return aur.Maintainer
case "submitter":
return aur.Submitter
case "depends":
return aur.Depends
case "makedepends":
@ -85,41 +30,23 @@ func getSearchBy(value string) aur.By {
return aur.OptDepends
case "checkdepends":
return aur.CheckDepends
case "provides":
return aur.Provides
case "conflicts":
return aur.Conflicts
case "replaces":
return aur.Replaces
case "groups":
return aur.Groups
case "keywords":
return aur.Keywords
case "comaintainers":
return aur.CoMaintainers
default:
return aur.NameDesc
}
}
// PrintSearch handles printing search results in a given format.
func (q aurQuery) printSearch(
w io.Writer,
start int,
dbExecutor db.Executor,
searchMode SearchVerbosity,
bottomUp,
singleLineResults bool,
) {
for i := range q {
if searchMode == Minimal {
_, _ = fmt.Fprintln(w, q[i].Name)
continue
}
var toprint string
if searchMode == NumberMenu {
if bottomUp {
toprint += text.Magenta(strconv.Itoa(len(q)+start-i-1) + " ")
} else {
toprint += text.Magenta(strconv.Itoa(start+i) + " ")
}
}
toprint += aurPkgSearchString(&q[i], dbExecutor, singleLineResults)
_, _ = fmt.Fprintln(w, toprint)
}
}
func aurPkgSearchString(
pkg *aur.Pkg,
dbExecutor db.Executor,
@ -157,29 +84,6 @@ func aurPkgSearchString(
return toPrint
}
// PrintSearch receives a RepoSearch type and outputs pretty text.
func (r repoQuery) printSearch(w io.Writer, dbExecutor db.Executor, searchMode SearchVerbosity, bottomUp, singleLineResults bool) {
for i, res := range r {
if searchMode == Minimal {
_, _ = fmt.Fprintln(w, res.Name())
continue
}
var toprint string
if searchMode == NumberMenu {
if bottomUp {
toprint += text.Magenta(strconv.Itoa(len(r)-i) + " ")
} else {
toprint += text.Magenta(strconv.Itoa(i+1) + " ")
}
}
toprint += syncPkgSearchString(res, dbExecutor, singleLineResults)
_, _ = fmt.Fprintln(w, toprint)
}
}
// PrintSearch receives a RepoSearch type and outputs pretty text.
func syncPkgSearchString(pkg alpm.IPackage, dbExecutor db.Executor, singleLineResults bool) string {
toPrint := text.Bold(text.ColorHash(pkg.DB().Name())) + "/" + text.Bold(pkg.Name()) +

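The extended --searchby mapping now also covers submitter, provides, conflicts, replaces, groups, keywords and comaintainers, with anything unrecognised still defaulting to a combined name-and-description search. A tiny in-package sketch; the wrapper function is mine.

package query

import (
	"fmt"

	"github.com/Jguer/aur"
)

func exampleSearchBy() {
	fmt.Println(getSearchBy("provides") == aur.Provides)           // true
	fmt.Println(getSearchBy("comaintainers") == aur.CoMaintainers) // true
	fmt.Println(getSearchBy("not-a-field") == aur.NameDesc)        // true (default)
}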
View File

@ -1,249 +0,0 @@
package query
import (
"strings"
"testing"
"github.com/stretchr/testify/assert"
"github.com/Jguer/yay/v11/pkg/db/mock"
"github.com/Jguer/yay/v11/pkg/text"
"github.com/Jguer/aur"
)
var (
pkgA = aur.Pkg{
Name: "package-a",
Version: "1.0.0",
Description: "Package A description",
Maintainer: "Package A Maintainer",
}
pkgARepo = &mock.Package{
PName: pkgA.Name,
PVersion: pkgA.Version,
PDescription: pkgA.Description,
PSize: 1,
PISize: 1,
PDB: mock.NewDB("dba"),
}
pkgB = aur.Pkg{
Name: "package-b",
Version: "1.0.0",
Description: "Package B description",
Maintainer: "Package B Maintainer",
}
pkgBRepo = &mock.Package{
PName: pkgB.Name,
PVersion: pkgB.Version,
PDescription: pkgB.Description,
PSize: 1,
PISize: 1,
PDB: mock.NewDB("dbb"),
}
)
func Test_aurQuery_printSearch(t *testing.T) {
type args struct {
searchMode SearchVerbosity
singleLineResults bool
}
tests := []struct {
name string
q aurQuery
args args
useColor bool
want string
}{
{
name: "AUR,Minimal,NoColor",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: Minimal,
},
want: "package-a\npackage-b\n",
},
{
name: "AUR,DoubleLine,NumberMenu,NoColor",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: NumberMenu,
singleLineResults: false,
},
want: "1 aur/package-a 1.0.0 (+0 0.00) \n Package A description\n2 aur/package-b 1.0.0 (+0 0.00) \n Package B description\n",
},
{
name: "AUR,SingleLine,NumberMenu,NoColor",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: NumberMenu,
singleLineResults: true,
},
want: "1 aur/package-a 1.0.0 (+0 0.00) \tPackage A description\n2 aur/package-b 1.0.0 (+0 0.00) \tPackage B description\n",
},
{
name: "AUR,DoubleLine,Detailed,NoColor",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: Detailed,
singleLineResults: false,
},
want: "aur/package-a 1.0.0 (+0 0.00) \n Package A description\naur/package-b 1.0.0 (+0 0.00) \n Package B description\n",
},
{
name: "AUR,SingleLine,Detailed,NoColor",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: Detailed,
singleLineResults: true,
},
want: "aur/package-a 1.0.0 (+0 0.00) \tPackage A description\naur/package-b 1.0.0 (+0 0.00) \tPackage B description\n",
},
{
name: "AUR,DoubleLine,Detailed,Color",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: Detailed,
singleLineResults: false,
},
useColor: true,
want: "\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mpackage-a\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (+0\x1b[0m \x1b[1m0.00) \x1b[0m\n Package A description\n\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mpackage-b\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (+0\x1b[0m \x1b[1m0.00) \x1b[0m\n Package B description\n",
},
{
name: "AUR,SingleLine,Detailed,Color",
q: aurQuery{pkgA, pkgB},
args: args{
searchMode: Detailed,
singleLineResults: true,
},
useColor: true,
want: "\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mpackage-a\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (+0\x1b[0m \x1b[1m0.00) \x1b[0m\tPackage A description\n\x1b[1m\x1b[34maur\x1b[0m\x1b[0m/\x1b[1mpackage-b\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (+0\x1b[0m \x1b[1m0.00) \x1b[0m\tPackage B description\n",
},
{
name: "AUR,NoPackages",
q: aurQuery{},
args: args{
searchMode: Detailed,
singleLineResults: true,
},
useColor: true,
want: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
w := &strings.Builder{}
executor := mock.DBExecutor{}
text.UseColor = tt.useColor
// Fire
tt.q.printSearch(w, 1, executor, tt.args.searchMode, false, tt.args.singleLineResults)
got := w.String()
assert.Equal(t, tt.want, got)
})
}
}
func Test_repoQuery_printSearch(t *testing.T) {
type args struct {
searchMode SearchVerbosity
singleLineResults bool
}
tests := []struct {
name string
q repoQuery
args args
useColor bool
want string
}{
{
name: "REPO,Minimal,NoColor",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: Minimal,
},
want: "package-a\npackage-b\n",
},
{
name: "REPO,DoubleLine,NumberMenu,NoColor",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: NumberMenu,
singleLineResults: false,
},
want: "1 dba/package-a 1.0.0 (1.0 B 1.0 B) \n Package A description\n2 dbb/package-b 1.0.0 (1.0 B 1.0 B) \n Package B description\n",
},
{
name: "REPO,SingleLine,NumberMenu,NoColor",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: NumberMenu,
singleLineResults: true,
},
want: "1 dba/package-a 1.0.0 (1.0 B 1.0 B) \tPackage A description\n2 dbb/package-b 1.0.0 (1.0 B 1.0 B) \tPackage B description\n",
},
{
name: "REPO,DoubleLine,Detailed,NoColor",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: Detailed,
singleLineResults: false,
},
want: "dba/package-a 1.0.0 (1.0 B 1.0 B) \n Package A description\ndbb/package-b 1.0.0 (1.0 B 1.0 B) \n Package B description\n",
},
{
name: "REPO,SingleLine,Detailed,NoColor",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: Detailed,
singleLineResults: true,
},
want: "dba/package-a 1.0.0 (1.0 B 1.0 B) \tPackage A description\ndbb/package-b 1.0.0 (1.0 B 1.0 B) \tPackage B description\n",
},
{
name: "AUR,DoubleLine,Detailed,Color",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: Detailed,
singleLineResults: false,
},
useColor: true,
want: "\x1b[1m\x1b[35mdba\x1b[0m\x1b[0m/\x1b[1mpackage-a\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n Package A description\n\x1b[1m\x1b[36mdbb\x1b[0m\x1b[0m/\x1b[1mpackage-b\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\n Package B description\n",
},
{
name: "REPO,SingleLine,Detailed,Color",
q: repoQuery{pkgARepo, pkgBRepo},
args: args{
searchMode: Detailed,
singleLineResults: true,
},
useColor: true,
want: "\x1b[1m\x1b[35mdba\x1b[0m\x1b[0m/\x1b[1mpackage-a\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\tPackage A description\n\x1b[1m\x1b[36mdbb\x1b[0m\x1b[0m/\x1b[1mpackage-b\x1b[0m \x1b[36m1.0.0\x1b[0m\x1b[1m (1.0 B 1.0 B) \x1b[0m\tPackage B description\n",
},
{
name: "REPO,NoPackages",
q: repoQuery{},
args: args{
searchMode: Detailed,
singleLineResults: true,
},
useColor: true,
want: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
w := &strings.Builder{}
executor := mock.DBExecutor{}
text.UseColor = tt.useColor
// Fire
tt.q.printSearch(w, executor, tt.args.searchMode, false, tt.args.singleLineResults)
got := w.String()
assert.Equal(t, tt.want, got)
})
}
}

79
pkg/query/version_diff.go Normal file
View File

@ -0,0 +1,79 @@
package query
import (
"strings"
"unicode"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/Jguer/go-alpm/v2"
)
func GetVersionDiff(oldVersion, newVersion string) (left, right string) {
if oldVersion == newVersion {
return oldVersion + text.Red(""), newVersion + text.Green("")
}
diffPosition := 0
checkWords := func(str string, index int, words ...string) bool {
// Make sure the word is not part of a longer word
ongoingWord := unicode.IsLetter(rune(str[index]))
if ongoingWord {
return false
}
for _, word := range words {
wordLength := len(word)
nextIndex := index + 1
if (index < len(str)-wordLength) &&
(str[nextIndex:(nextIndex+wordLength)] == word) {
return true
}
}
return false
}
for index, char := range oldVersion {
charIsSpecial := !unicode.IsLetter(char) && !unicode.IsNumber(char)
if (index >= len(newVersion)) || (char != rune(newVersion[index])) {
if charIsSpecial {
diffPosition = index
}
break
}
if charIsSpecial ||
(((index == len(oldVersion)-1) || (index == len(newVersion)-1)) &&
((len(oldVersion) != len(newVersion)) ||
(oldVersion[index] == newVersion[index]))) ||
checkWords(oldVersion, index, "rc", "pre", "alpha", "beta") {
diffPosition = index + 1
}
}
samePart := oldVersion[0:diffPosition]
left = samePart + text.Red(oldVersion[diffPosition:])
right = samePart + text.Green(newVersion[diffPosition:])
return left, right
}
func isDevelName(name string) bool {
for _, suffix := range []string{"git", "svn", "hg", "bzr", "nightly", "insiders-bin"} {
if strings.HasSuffix(name, "-"+suffix) {
return true
}
}
return strings.Contains(name, "-always-")
}
func isDevelPackage(pkg alpm.IPackage) bool {
return isDevelName(pkg.Name()) || isDevelName(pkg.Base())
}
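
Besides the coloured version diff, this file also carries the devel-package heuristic: a package counts as a development package if its name (or base) ends in a VCS-style suffix or contains "-always-". A short in-package sketch; the package names are examples of mine.

package query

import "fmt"

func exampleIsDevel() {
	fmt.Println(isDevelName("neovim-git"))            // true: "-git" suffix
	fmt.Println(isDevelName("vscodium-insiders-bin")) // true: "-insiders-bin" suffix
	fmt.Println(isDevelName("firefox"))               // false
}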

View File

@ -0,0 +1,68 @@
//go:build !integration
// +build !integration
package query
import (
"testing"
"github.com/Jguer/yay/v12/pkg/text"
)
func TestVersionDiff(t *testing.T) {
testCases := []struct {
name string
a string
b string
wantDiff string
}{
{
name: "1.0.0-1 -> 1.0.0-2",
a: "1.0.0-1",
b: "1.0.0-2",
wantDiff: "1.0.0-" + text.Red("1") + " " + "1.0.0-" + text.Green("2"),
},
{
name: "1.0.0-1 -> 1.0.1-1",
a: "1.0.0-1",
b: "1.0.1-1",
wantDiff: "1.0." + text.Red("0-1") + " " + "1.0." + text.Green("1-1"),
},
{
name: "3.0.0~alpha7-3 -> 3.0.0~alpha7-4",
a: "3.0.0~alpha7-3",
b: "3.0.0~alpha7-4",
wantDiff: "3.0.0~alpha7-" + text.Red("3") + " " + "3.0.0~alpha7-" + text.Green("4"),
},
{
name: "3.0.0~beta7-3 -> 3.0.0~beta8-3",
a: "3.0.0~beta7-3",
b: "3.0.0~beta8-3",
wantDiff: "3.0.0~" + text.Red("beta7-3") + " " + "3.0.0~" + text.Green("beta8-3"),
},
{
name: "23.04.r131.b1bfe05-1 -> 23.04.r131.b1bfe07-1",
a: "23.04.r131.b1bfe05-1",
b: "23.04.r131.b1bfe07-1",
wantDiff: "23.04.r131." + text.Red("b1bfe05-1") + " " + "23.04.r131." + text.Green("b1bfe07-1"),
},
{
name: "1.0.arch0-1 -> 1.0.arch1-2",
a: "1.0.arch0-1",
b: "1.0.arch1-2",
wantDiff: "1.0." + text.Red("arch0-1") + " " + "1.0." + text.Green("arch1-2"),
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
originalUseColor := text.UseColor
text.UseColor = true
left, right := GetVersionDiff(tc.a, tc.b)
gotDiff := left + " " + right
if gotDiff != tc.wantDiff {
t.Errorf("VersionDiff(%s, %s) = %s, want %s", tc.a, tc.b, gotDiff, tc.wantDiff)
}
text.UseColor = originalUseColor
})
}
}

65
pkg/runtime/pacman.go Normal file
View File

@ -0,0 +1,65 @@
package runtime
import (
"fmt"
"os"
"github.com/Jguer/yay/v12/pkg/settings/parser"
pacmanconf "github.com/Morganamilo/go-pacmanconf"
"golang.org/x/term"
)
func retrievePacmanConfig(cmdArgs *parser.Arguments, pacmanConfigPath string) (*pacmanconf.Config, bool, error) {
root := "/"
if value, _, exists := cmdArgs.GetArg("root", "r"); exists {
root = value
}
pacmanConf, stderr, err := pacmanconf.PacmanConf("--config", pacmanConfigPath, "--root", root)
if err != nil {
cmdErr := err
if stderr != "" {
cmdErr = fmt.Errorf("%w\n%s", err, stderr)
}
return nil, false, cmdErr
}
if dbPath, _, exists := cmdArgs.GetArg("dbpath", "b"); exists {
pacmanConf.DBPath = dbPath
}
if arch := cmdArgs.GetArgs("arch"); arch != nil {
pacmanConf.Architecture = append(pacmanConf.Architecture, arch...)
}
if ignoreArray := cmdArgs.GetArgs("ignore"); ignoreArray != nil {
pacmanConf.IgnorePkg = append(pacmanConf.IgnorePkg, ignoreArray...)
}
if ignoreGroupsArray := cmdArgs.GetArgs("ignoregroup"); ignoreGroupsArray != nil {
pacmanConf.IgnoreGroup = append(pacmanConf.IgnoreGroup, ignoreGroupsArray...)
}
if cacheArray := cmdArgs.GetArgs("cachedir"); cacheArray != nil {
pacmanConf.CacheDir = cacheArray
}
if gpgDir, _, exists := cmdArgs.GetArg("gpgdir"); exists {
pacmanConf.GPGDir = gpgDir
}
useColor := pacmanConf.Color && term.IsTerminal(int(os.Stdout.Fd()))
switch value, _, _ := cmdArgs.GetArg("color"); value {
case "always":
useColor = true
case "auto":
useColor = term.IsTerminal(int(os.Stdout.Fd()))
case "never":
useColor = false
}
return pacmanConf, useColor, nil
}
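
The colour decision deserves a note: pacman.conf's Color setting only applies when stdout is a terminal, and an explicit --color always/auto/never on the command line overrides both. A compact sketch of that precedence, with illustrative values; the helper is mine and only mirrors the switch above.

package main

import "fmt"

// resolveColor mirrors retrievePacmanConfig: start from pacman.conf's Color
// ANDed with "stdout is a TTY", then let an explicit --color flag override it.
func resolveColor(confColor, isTTY bool, colorFlag string) bool {
	useColor := confColor && isTTY
	switch colorFlag {
	case "always":
		useColor = true
	case "auto":
		useColor = isTTY
	case "never":
		useColor = false
	}
	return useColor
}

func main() {
	fmt.Println(resolveColor(true, false, ""))       // false: no TTY
	fmt.Println(resolveColor(true, false, "always")) // true: flag wins
	fmt.Println(resolveColor(false, true, "auto"))   // true: auto follows the TTY
}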

View File

@ -0,0 +1,66 @@
//go:build !integration
// +build !integration
package runtime
import (
"path/filepath"
"runtime"
"testing"
"github.com/Morganamilo/go-pacmanconf"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/Jguer/yay/v12/pkg/settings/parser"
)
func TestPacmanConf(t *testing.T) {
t.Parallel()
path := "../../testdata/pacman.conf"
absPath, err := filepath.Abs(path)
require.NoError(t, err)
// detect the architecture of the system
expectedArch := []string{"x86_64"}
if runtime.GOARCH == "arm64" {
expectedArch = []string{"aarch64"}
}
expectedPacmanConf := &pacmanconf.Config{
RootDir: "/", DBPath: "/var/lib/pacman/",
CacheDir: []string{"/var/cache/pacman/pkg/"},
HookDir: []string{"/etc/pacman.d/hooks/"},
GPGDir: "/etc/pacman.d/gnupg/", LogFile: "/var/log/pacman.log",
HoldPkg: []string{"pacman", "glibc"}, IgnorePkg: []string{"xorm"},
IgnoreGroup: []string{"yorm"}, Architecture: expectedArch,
XferCommand: "/usr/bin/wget --passive-ftp -c -O %o %u",
NoUpgrade: []string(nil), NoExtract: []string(nil), CleanMethod: []string{"KeepInstalled"},
SigLevel: []string{"PackageRequired", "PackageTrustedOnly", "DatabaseOptional", "DatabaseTrustedOnly"},
LocalFileSigLevel: []string{"PackageOptional", "PackageTrustedOnly"},
RemoteFileSigLevel: []string{"PackageRequired", "PackageTrustedOnly"}, UseSyslog: true,
Color: true, UseDelta: 0, TotalDownload: false, CheckSpace: true,
VerbosePkgLists: true, DisableDownloadTimeout: false,
Repos: []pacmanconf.Repository{
{
Name: "core", Servers: []string{"Core"},
SigLevel: []string(nil), Usage: []string{"All"},
},
{
Name: "extra", Servers: []string{"Extra"}, SigLevel: []string(nil),
Usage: []string{"All"},
},
{
Name: "multilib", Servers: []string{"repo3", "multilib"},
SigLevel: []string(nil), Usage: []string{"All"},
},
},
}
pacmanConf, color, err := retrievePacmanConfig(parser.MakeArguments(), absPath)
assert.Nil(t, err)
assert.NotNil(t, pacmanConf)
assert.Equal(t, color, false)
assert.EqualValues(t, expectedPacmanConf, pacmanConf)
}

138
pkg/runtime/runtime.go Normal file
View File

@ -0,0 +1,138 @@
package runtime
import (
"context"
"fmt"
"net/http"
"os"
"path/filepath"
"github.com/leonelquinteros/gotext"
"github.com/Jguer/yay/v12/pkg/query"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/exe"
"github.com/Jguer/yay/v12/pkg/settings/parser"
"github.com/Jguer/yay/v12/pkg/text"
"github.com/Jguer/yay/v12/pkg/vcs"
"github.com/Jguer/aur"
"github.com/Jguer/aur/metadata"
"github.com/Jguer/aur/rpc"
"github.com/Jguer/votar/pkg/vote"
"github.com/Morganamilo/go-pacmanconf"
"golang.org/x/net/proxy"
)
type Runtime struct {
Cfg *settings.Configuration
QueryBuilder query.Builder
PacmanConf *pacmanconf.Config
VCSStore vcs.Store
CmdBuilder exe.ICmdBuilder
HTTPClient *http.Client
VoteClient *vote.Client
AURClient aur.QueryClient
Logger *text.Logger
}
func NewRuntime(cfg *settings.Configuration, cmdArgs *parser.Arguments, version string) (*Runtime, error) {
logger := text.NewLogger(os.Stdout, os.Stderr, os.Stdin, cfg.Debug, "runtime")
runner := exe.NewOSRunner(logger.Child("runner"))
transport := http.DefaultTransport.(*http.Transport).Clone()
if socks5_proxy := os.Getenv("SOCKS5_PROXY"); socks5_proxy != "" {
dialer, err := proxy.SOCKS5("tcp", socks5_proxy, nil, proxy.Direct)
if err != nil {
return nil, err
}
transport = &http.Transport{Dial: dialer.Dial}
}
httpClient := &http.Client{
CheckRedirect: func(req *http.Request, via []*http.Request) error {
return http.ErrUseLastResponse
},
Transport: transport,
}
userAgent := fmt.Sprintf("Yay/%s", version)
voteClient, errVote := vote.NewClient(vote.WithUserAgent(userAgent),
vote.WithHTTPClient(httpClient))
if errVote != nil {
return nil, errVote
}
voteClient.SetCredentials(
os.Getenv("AUR_USERNAME"),
os.Getenv("AUR_PASSWORD"))
userAgentFn := func(ctx context.Context, req *http.Request) error {
req.Header.Set("User-Agent", userAgent)
return nil
}
var aurCache aur.QueryClient
aurCache, errAURCache := metadata.New(
metadata.WithHTTPClient(httpClient),
metadata.WithCacheFilePath(filepath.Join(cfg.BuildDir, "aur.json")),
metadata.WithRequestEditorFn(userAgentFn),
metadata.WithBaseURL(cfg.AURURL),
metadata.WithDebugLogger(logger.Debugln),
)
if errAURCache != nil {
return nil, fmt.Errorf(gotext.Get("failed to retrieve aur Cache")+": %w", errAURCache)
}
aurClient, errAUR := rpc.NewClient(
rpc.WithHTTPClient(httpClient),
rpc.WithBaseURL(cfg.AURRPCURL),
rpc.WithRequestEditorFn(userAgentFn),
rpc.WithLogFn(logger.Debugln))
if errAUR != nil {
return nil, errAUR
}
if cfg.UseRPC {
aurCache = aurClient
}
pacmanConf, useColor, err := retrievePacmanConfig(cmdArgs, cfg.PacmanConf)
if err != nil {
return nil, err
}
// FIXME: get rid of global
text.UseColor = useColor
cmdBuilder := exe.NewCmdBuilder(cfg, runner, logger.Child("cmdbuilder"), pacmanConf.DBPath)
vcsStore := vcs.NewInfoStore(
cfg.VCSFilePath, cmdBuilder,
logger.Child("vcs"))
if err := vcsStore.Load(); err != nil {
return nil, err
}
queryBuilder := query.NewSourceQueryBuilder(
aurClient,
logger.Child("mixed.querybuilder"), cfg.SortBy,
cfg.Mode, cfg.SearchBy,
cfg.BottomUp, cfg.SingleLineResults, cfg.SeparateSources)
run := &Runtime{
Cfg: cfg,
QueryBuilder: queryBuilder,
PacmanConf: pacmanConf,
VCSStore: vcsStore,
CmdBuilder: cmdBuilder,
HTTPClient: &http.Client{},
VoteClient: voteClient,
AURClient: aurCache,
Logger: logger,
}
return run, nil
}
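
One behaviour worth calling out here is the SOCKS5 support: when SOCKS5_PROXY is set, the shared HTTP client dials through that proxy for AUR, vote and metadata traffic. A minimal sketch of wiring a runtime up that way; the proxy address and configuration values are illustrative, while the field names come from this file and the test below.

package main

import (
	"fmt"
	"os"

	"github.com/Jguer/yay/v12/pkg/runtime"
	"github.com/Jguer/yay/v12/pkg/settings"
	"github.com/Jguer/yay/v12/pkg/settings/parser"
)

func main() {
	// Route all of yay's HTTP traffic through a local SOCKS5 proxy.
	os.Setenv("SOCKS5_PROXY", "127.0.0.1:1080")

	cfg := &settings.Configuration{
		AURURL:     "https://aur.archlinux.org",
		AURRPCURL:  "https://aur.archlinux.org/rpc",
		BuildDir:   "/tmp",
		PacmanConf: "/etc/pacman.conf",
	}

	run, err := runtime.NewRuntime(cfg, parser.MakeArguments(), "12.0.0")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("query builder ready:", run.QueryBuilder != nil)
}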

View File

@ -0,0 +1,52 @@
//go:build !integration
// +build !integration
package runtime_test
import (
"path/filepath"
"testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/Jguer/yay/v12/pkg/runtime"
"github.com/Jguer/yay/v12/pkg/settings"
"github.com/Jguer/yay/v12/pkg/settings/parser"
)
func TestBuildRuntime(t *testing.T) {
t.Parallel()
path := "../../testdata/pacman.conf"
absPath, err := filepath.Abs(path)
require.NoError(t, err)
// Prepare test inputs
cfg := &settings.Configuration{
Debug: true,
UseRPC: false,
AURURL: "https://aur.archlinux.org",
AURRPCURL: "https://aur.archlinux.org/rpc",
BuildDir: "/tmp",
VCSFilePath: "",
PacmanConf: absPath,
}
cmdArgs := parser.MakeArguments()
version := "1.0.0"
// Call the function being tested
run, err := runtime.NewRuntime(cfg, cmdArgs, version)
require.NoError(t, err)
// Assert the function's output
assert.NotNil(t, run)
assert.NotNil(t, run.QueryBuilder)
assert.NotNil(t, run.PacmanConf)
assert.NotNil(t, run.VCSStore)
assert.NotNil(t, run.CmdBuilder)
assert.NotNil(t, run.HTTPClient)
assert.NotNil(t, run.VoteClient)
assert.NotNil(t, run.AURClient)
assert.NotNil(t, run.Logger)
}

Some files were not shown because too many files have changed in this diff.