Age | Commit message | Author |
|
If we have files in partial/ from a previous invocation or similar,
those could be symlinks created by file:// sources. The code is
expecting only real files though and happily changes owner,
modification times and permissions on the file the symlink points to,
which tend to be files we have no business touching in this way.
Permissions of symlinks shouldn't be changed, and changing the owner is
usually pointless too, so just to be sure we pick the easy way out: use
lchown and check for symlinks before chmod/utimes.
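A minimal sketch of the resulting handling (illustrative, not the exact
apt code; names are made up):

    #include <sys/stat.h>
    #include <sys/time.h>
    #include <unistd.h>

    // Only adjust metadata on real files; never follow a symlink left in partial/.
    static bool FixupPartialFile(const char *path, uid_t uid, gid_t gid,
                                 mode_t mode, struct timeval times[2])
    {
       struct stat st;
       if (lstat(path, &st) != 0)
          return false;
       // lchown operates on the link itself instead of its target
       if (lchown(path, uid, gid) != 0)
          return false;
       if (S_ISLNK(st.st_mode))
          return true;            // don't chmod/utimes through the symlink
       return chmod(path, mode) == 0 && utimes(path, times) == 0;
    }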
Reported-By: Mattia Rizzolo on IRC
|
|
If other logs can't be written this is a warning too, so for
consistency's sake translate the errors to warnings.
|
|
Unlikely to happen in practice, and I wonder more how I could miss
these in earlier reviews, but okay, let's fix it for consistency now.
|
|
If libapt has builtin support for a compression type it will create a
dummy compressor struct with the Binary set to 'false', as it will
catch these before using the generic pipe implementation which uses the
Binary. The catching happens based on the configured Names though, so
you can actually force apt to use the external binaries even if it
would usually use the builtin support. That logic fails though if you
don't happen to have these external binaries installed, as it will fall
back to calling 'false', which ends in confusing 'Write error's.
So, this is again something you only encounter in constructed testing.
Gbp-Dch: Ignore
|
|
This is insofar pointless as the first match will deal with the
extension, so we don't actually ever use these second instances –
probably for the better as most need arguments to behave as expected &
more importantly: the point of the exercise is disabling their use for
testing purposes.
Gbp-Dch: Ignore
|
|
All apt versions support numeric as well as 3-character timezones just
fine and it's actually hard to write code which doesn't "accidentally"
accept them. So why change? Documenting the Date/Valid-Until fields in
the Release file is easy to do in terms of referencing the datetime
format used e.g. in Debian changelogs (policy §4.4). This format
specifies only the numeric timezones though, not the nowadays obsolete
3-character ones, so in the interest of least surprise we should use
the same format even though it carries a small risk of regression in
other clients (which encounter repositories created with
apt-ftparchive).
In case it is really regressing in practice, the hidden option
-o APT::FTPArchive::Release::NumericTimezone=0
can be used to go back to good old UTC as timezone.
The EDSP and EIPP protocols use this 'new' format; the text interface
used to communicate with the acquire methods does not, for
compatibility reasons, even if none of our methods would be affected
and I doubt any other would (in these instances the timezone is 'GMT'
as that is what HTTP/1.1 requires). Note that this is only true for apt
talking to methods; (libapt-based) methods talking to apt will respond
with the 'new' format. It is therefore strongly advised to also support
both in method input.
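For illustration, both spellings of the same instant can be produced
with strftime (a standalone sketch, not apt-ftparchive code; the
example timestamps are made up):

    #include <cstdio>
    #include <ctime>

    int main()
    {
       char buf[64];
       time_t now = time(nullptr);
       struct tm utc;
       gmtime_r(&now, &utc);
       // numeric timezone, as in Debian changelogs (and now apt-ftparchive):
       strftime(buf, sizeof(buf), "%a, %d %b %Y %H:%M:%S %z", &utc);
       printf("Date: %s\n", buf);   // e.g. Date: Fri, 01 Jul 2016 10:00:00 +0000
       // the old 3-character style:
       strftime(buf, sizeof(buf), "%a, %d %b %Y %H:%M:%S UTC", &utc);
       printf("Date: %s\n", buf);   // e.g. Date: Fri, 01 Jul 2016 10:00:00 UTC
       return 0;
    }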
|
|
As the volatile sources are parsed last they were sorted behind the
dpkg/status file and hence treated as a downgrade, which isn't really
what you want to happen, as from a user POV it's an upgrade.
|
|
If we have a (e.g. locally built) deb file installed and try to install
it again, apt complained about this being a downgrade, but it wasn't,
as it is the very same version… it was just confused into not merging
the versions together, which then looks like a downgrade.
The same-size assumption is usually good, but given that volatile files
are parsed last (even after the status file) the base assumption no
longer holds; it is easy to adapt though without actually changing
anything in practice.
|
|
Traditionally all providers of something are protected, as apt can't
know which of them is actually providing the functionality for the
user; that ensures we don't propose the removal of used stuff, but it
of course also keeps stuff around which could be removed. That can
cause the accumulation of multiple old providers until the provided
package is itself no longer needed (e.g. out-of-tree kernel modules).
We combat this by marking providers only from the newest source package
version, so that old providers built by older versions of the same
source package can be garbage collected.
|
|
Like the previous commit, this shouldn't change behavior at all, but
besides being more explicit and perhaps faster it's also considerably
shorter (granted, mostly by if0-block elimination).
Gbp-Dch: Ignore
|
|
Piling everything into a single if statement always made my head
wobble, but it doesn't even have a benefit, as the most common case of
a package which isn't installed passes all of the old if and lands in
the non-existent else-part of the inner if. So besides a subjective
cleanup of what goes on, this implementation should also be a bit
faster. There should be no change in behavior.
Gbp-Dch: Ignore
|
|
Writing first means that even in the event of a power failure the
autobit is saved for future processing instead of "forgotten", which
would result in the package being treated as manually installed.
In some cases we have to re-run the writing after dpkg is done though,
as dpkg can let packages disappear and in such cases apt will move
autobits around (or in that case non-autobits), which we need to store.
|
|
If we can't read the old file we can't just move forward, as that would
potentially discard old data (especially other fields). We let it fail
only after we are done writing the new file, so a user has the chance
to look into it and merge the new data (which would otherwise be
discarded).
|
|
We deploy atomic renames for some files, but these renames also
happened if something about the file failed, which isn't really the
point of the exercise…
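The pattern, roughly (an illustrative sketch, not the apt
implementation; names are made up):

    #include <cstdio>
    #include <fstream>
    #include <string>

    // Write to a temporary sibling and only rename over the target on success,
    // so a failed write never clobbers the existing file.
    static bool AtomicReplace(std::string const &target, std::string const &content)
    {
       std::string const tmp = target + ".tmp";
       std::ofstream out(tmp.c_str(), std::ios::trunc);
       out << content;
       out.flush();
       if (out.good() == false)
       {
          out.close();
          std::remove(tmp.c_str());   // drop the broken temporary instead of renaming it
          return false;
       }
       out.close();
       return std::rename(tmp.c_str(), target.c_str()) == 0;
    }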
Closes: 828908
|
|
This reverts commit 2b8221d66a8284042fc53c7bbb14bb9750e9137f.
Avoiding the use of GCC >= 5 stuff lets us go back to 4.8, simplifying
the Travis setup again as well as reducing the backport requirements in
general.
This is possible because the std::get_time use requiring GCC >= 5 in
9febc2b238e1e322dce1f94ecbed46d595893b52 was replaced by handrolling it
in 1d742e01470bba27715a8191c50adde4b39c2f19, so the remaining uses are
just small conveniences we can do without.
Gbp-Dch: Ignore
|
|
Usually these config options are set to sensible values, but if init
isn't run, or the user interferes with the configuration by clearing it
or similar, the options could indeed carry an empty value, which will
result in FindDir returning '/'. That feels kinda wrong, but as it is a
public interface there isn't much we can do about it; instead we make
it so that in such cases we get the special file /dev/null back, which
we know how to deal with.
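Roughly the idea (an illustrative helper, not the actual FindDir
implementation):

    #include <string>

    // If a directory option carries an empty value the lookup degrades to "/";
    // hand back /dev/null instead, a special file callers know how to handle.
    static std::string DirOrDevNull(std::string const &dir)
    {
       if (dir.empty() || dir == "/")
          return "/dev/null";
       return dir;
    }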
|
|
Julian noticed on IRC that I fell victim to a lovely false friend by
referring to a 'planer' all the time, even though these are machines to
e.g. remove splinters from woodwork ("make stuff plane"). The term I
meant is written this way in German (= with a single n), but in English
there are two, aka: 'planner'.
As that is unreleased code we switch all instances without any
transitional provisions. That is also the reason why it's skipped in
the changelog.
Thanks: Julian Andres Klode
Gbp-Dch: Ignore
|
|
Needed for the previous change
|
|
If a package file is formatted in a way that no space follows a
deprecated "<", we would reformat it to "<=" and increase the length of
the output by 1, which can break.
Under normal circumstances with "<=" this should not be an issue.
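To illustrate the one-byte growth with a hypothetical field (not the
actual rewriting code):

    #include <cassert>
    #include <string>

    int main()
    {
       // deprecated spelling without a following space, as it may appear in a Packages file
       std::string dep = "Depends: foo (<2.0)";
       // the writer emits the canonical "<=" for the deprecated "<" ...
       dep.replace(dep.find('<'), 1, "<=");
       // ... so the output is one byte longer than the input
       assert(dep == "Depends: foo (<=2.0)");
       return 0;
    }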
Closes: #828812
|
|
In 385d9f2f23057bc5808b5e013e77ba16d1c94da4 I implemented the storage
of scenario files, intending to enable it by default for EIPP, but
implemented it first optionally for EDSP to have it independent.
The reasons mentioned in the earlier commit (debugging and bug reports)
obviously apply here, especially as EIPP solutions aren't user-approved,
are nearly impossible to verify before starting the execution, and at
the time of error the scenario has already changed, so that reproducing
the issue becomes hard(er).
|
|
Freeing 'Install' for future use as an interface for "dpkg --install",
which is currently not used by any existing planer, so its
implementation itself will be delayed until then.
|
|
A rather special-need option, but the internal planer supports this and
we have a testcase for it & sometimes it is hit (as a bug though). The
option itself mostly serves as a reminder for implementors that they
should be careful with removes, and especially temporary removes, if
they perform any.
|
|
APT has 3 modes: no immediate configuration, configuring all packages
immediately, and its default mode of immediately configuring only
essentials and pseudo-essentials. While this seems like a job for
different planers at first, it might be handy to have it as an option,
too, in case a planer (like apt's internal one) supports different
modes where the introduction of individual planers would be
counterintuitive.
|
|
The generation of the EIPP request was a bit too strict, not generating
what would actually need to be part of the scenario.
|
|
For the ordering there is no inherent difference between delete and
purge, so we don't tell the planer about this and instead decide in apt
if a package should be purged or not. That also allows us to not tell
the planer about rc-only purges, as we can trivially do these on our
own – there is no need to plan such purges.
|
|
Testing the current implementation can benefit from being able to be
fed an EIPP request and produce a fully compliant response. It is also
a great test for EIPP in general.
|
|
We can trim generation time and size of the EIPP scenario considerably
if we avoid telling the planers about "uninteresting" packages.
This is one of the simpler but already very effective reductions:
do not tell planers about versions which are neither installed nor to
be installed, as they have no effect on the plan. EDSP solvers need to
know about all versions for better choices and error messages, but
planers really don't.
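Roughly the filter in question (a generic sketch with hypothetical
types, not apt's cache iterators):

    #include <vector>

    // Hypothetical, minimal view of a version, for illustration only.
    struct Version
    {
       bool Installed;       // currently on the system
       bool ToBeInstalled;   // selected for installation in this run
    };

    // Only versions which take part in the planned transaction are worth
    // writing into the EIPP scenario; everything else just bloats it.
    static bool InterestingForPlanner(Version const &Ver)
    {
       return Ver.Installed || Ver.ToBeInstalled;
    }

    static std::vector<Version> FilterScenario(std::vector<Version> const &All)
    {
       std::vector<Version> Scenario;
       for (auto const &Ver : All)
          if (InterestingForPlanner(Ver))
             Scenario.push_back(Ver);
       return Scenario;
    }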
Git-Dch: Ignore
|
|
The very first step in introducing the "external installation planer
protocol" (short: EIPP) as part of my GSoC 2016 project.
The description reads: APT-based tools like apt-get, aptitude, synaptic,
… work with the user to figure out how their system should look after
they are done installing/removing packages and their dependencies.
The actual installation/removal of packages is done by dpkg with the
constraint that dependencies must be fulfilled at any point in time
(e.g. to run maintainer scripts).
Historically APT has taken a super micro-management approach to this
task, which hasn't aged that well over the years, mostly ignoring
changes in dpkg and growing into an unmaintainable mess hardly anyone
can debug and everyone fears to touch – especially as more and more
requirements are tacked onto it, like handling cycles and triggers,
dealing with "important" packages first, package sources on removable
media, and touching minimal groups to be able to interrupt the process
if needed (e.g. unattended-upgrades), which not only sky-rocket
complexity but can also be mutually exclusive, as you e.g. can't have
minimal groups and minimal trigger executions at the same time.
|
|
In 3bdff17c894d0c3d0f813d358fc45d7a263f3552 we did it for the datetime
parsing, but we use the same style in the parsing for pdiff (where the
size of the file is in the middle of the three fields), so imbuing here
as well is a good idea.
|
|
Rewritten in 9febc2b238e1e322dce1f94ecbed46d595893b52 for C++ locales
usage and rewritten again in 1d742e01470bba27715a8191c50adde4b39c2f19
to avoid a currently present libstdc++6 bug in the std::get_time
implementation. The latter implementation still uses stringstreams for
parsing, but forgot to explicitly reset the locale to something sane
(for parsing English dates, that is), so the date and especially the
parsing of a number depends on the locale. Turns out the French (among
others) format their numbers with a space as thousands separator, so
for some reason libstdc++6 thinks it's a good idea to interpret the
entire datetime string as a single number instead of realizing that in
"25 Jun …" the later parts can't reasonably be part of that number even
though there are spaces there…
Workaround is hence: LC_NUMERIC=C.UTF-8
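The defensive fix pattern in a nutshell (an illustrative sketch, not
the actual apt parser):

    #include <iostream>
    #include <locale>
    #include <sstream>

    int main()
    {
       std::istringstream day("25 Jun 2016 07:23:01 +0000");
       // Numbers are parsed according to the locale imbued in the stream; pin
       // it to the classic "C" locale so separators from the user's locale
       // can't interfere with parsing an English datetime string.
       day.imbue(std::locale::classic());
       int d = 0;
       day >> d;
       std::cout << d << '\n';   // reliably prints 25
       return 0;
    }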
Closes: 828011
|
|
Weak had no dedicated option before, and Insecure and Downgrade were
both global options, which, given the effect they all have on security,
is rather bad. Being able to set them only for individual repositories
isn't great, but it is at least slightly better and also more
consistent with other settings for repositories.
|
|
Filesize is a silly hash all by itself, but in combination with others
it can be a strong opponent, so ensuring that it is in the list of
hashes, and hence checked in the normal course of action the acquire
process takes, is a good thing.
|
|
For "Hash Sum mismatch" that info doesn't make a whole lot of
difference, but for the new insufficient info message an indicator that
while this hashes are there and even match, they aren't enough from a
security standpoint.
|
|
Downloading and then saying "Hash Sum mismatch" isn't very friendly
from a user POV, so with this change we try to detect such cases early
on and report them, preferably before the download even starts.
Closes: 827758
|
|
With this commit all APT-based clients default to refusing to work with
unsigned or otherwise insufficiently secured repositories. In terms of
apt and apt-get this changes nothing, but it affects all tools using
libapt like aptitude, synaptic or packagekit.
The exception remains apt-get for stretch for now, as this might break
too many scripts/use cases too quickly.
The documentation is updated and extended to reflect how to opt out of
or in to this behaviour change.
Closes: 808367
|
|
Handling the extra check (and force requirement) for downgrades in
security in our AllowInsecureRepositories checker helps in having this
check everywhere instead of just in the most common place, and
requiring a little extra force in such cases is always good.
|
|
APT can be forced to deal with repositories which have no security
features whatsoever, so just giving up on repositories which "just"
fail our current criteria of good security features is the wrong
incentive. Of course, repositories are better off fixing their setup to
provide the minimum of security features, but sometimes this isn't
possible: historic repositories, for example, which do not change
(anymore).
That also fixes a problem with repositories which are marked as
trusted, but are providing only weak security features which would fail
the parsing of the Release file.
Closes: 827364
|
|
Insecure repositories result in error messages by default which cause
the acquire run to fail hard, but non-failing repositories are still
updated, just like with the slightly less hard failures which got this
behaviour in 35664152e47a1d4d712fd52e0f0a2dc8ed359d32.
|
|
There is a subtle difference between an empty setting and "DIRECT" in
the configuration, as the latter overrides the generic settings while
the former does not. Also, non-zero exit codes should really be
reported as an error rather than silently discarded.
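A sketch of the intended handling (illustrative only; the helper name
and plumbing are assumptions, not apt's actual code):

    #include <cstdio>
    #include <cstring>
    #include <iostream>
    #include <string>
    #include <sys/wait.h>

    // Run a proxy auto-detect command and interpret its answer:
    //  - non-zero exit  -> report an error instead of discarding it
    //  - "DIRECT"       -> explicitly no proxy, overrides generic settings
    //  - empty output   -> no answer, fall through to generic settings
    static bool DetectProxy(std::string const &cmd, std::string &proxy)
    {
       FILE *p = popen(cmd.c_str(), "r");
       if (p == nullptr)
          return false;
       char line[1024] = "";
       if (fgets(line, sizeof(line), p) != nullptr)
          line[strcspn(line, "\n")] = '\0';
       int const status = pclose(p);
       if (WIFEXITED(status) == 0 || WEXITSTATUS(status) != 0)
       {
          std::cerr << "Proxy auto-detect command returned an error\n";
          return false;
       }
       proxy = line;              // may be "DIRECT", empty, or a proxy URI
       return true;
    }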
|
|
Regression introduced in 8f858d560e3b7b475c623c4e242d1edce246025a.
Commands are probably better off always having output though, as the
fall-through to the generic proxy settings is likely not intended. As
documenting and implementing this more consistently is kind of a
regression though, it is split off into the next commit.
Closes: 827713
|
|
As reported upstream in
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71556
the implementation of std::get_time is currently not as accepting as
strptime is, especially in how hours should be formatted.
Just reverting 9febc2b238e1e322dce1f94ecbed46d595893b52 would be
possible, but then we would reopen the problems fixed by it, so instead
I opted here for a rewrite of the parsing logic which makes this method
a lot longer, but at least it provides the same benefits the rewrite to
std::get_time was intended to give us and decouples us from the fixing
of the issue in the standard library implementation of GCC.
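A rough idea of such handrolled parsing (a sketch assuming RFC 1123
style dates; not the code actually added to apt):

    #include <cstdio>
    #include <cstring>
    #include <ctime>

    // Parse "Wed, 22 Jun 2016 10:10:10 +0000" without std::get_time or strptime.
    static bool ParseRFC1123(const char *str, time_t &out)
    {
       static const char *const months[] = { "Jan", "Feb", "Mar", "Apr", "May", "Jun",
                                             "Jul", "Aug", "Sep", "Oct", "Nov", "Dec" };
       char wday[4] = "", mon[4] = "", zone[6] = "";
       struct tm tm = {};
       if (sscanf(str, "%3s, %d %3s %d %d:%d:%d %5s", wday, &tm.tm_mday, mon,
                  &tm.tm_year, &tm.tm_hour, &tm.tm_min, &tm.tm_sec, zone) != 8)
          return false;
       tm.tm_year -= 1900;
       tm.tm_mon = -1;
       for (int i = 0; i < 12; ++i)
          if (strcmp(mon, months[i]) == 0)
             tm.tm_mon = i;
       if (tm.tm_mon == -1)
          return false;
       out = timegm(&tm);   // interpret as UTC; a non-zero numeric offset in
       return true;         // 'zone' would still have to be applied by the caller
    }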
LP: 1593583
|
|
Merging by URI means that having sources lines with different URI
methods results in 'strange' warning and error messages, which aren't
very friendly from a user point of view, as not encoding the method in
the filename is effectively an implementation detail.
Merging by filename removes these messages and makes everything "work"
even if it isn't working the way it is configured, as the indexes
aren't acquired over the method given, but over the first method for
this release file (which arguably is an implementation detail stemming
from the filename encoding, too).
So neither direction is perfectly "right", but personally I prefer
"magic" over strange error messages (and doing a full-circle detection
of this with its own messages, which would need to be translated, feels
like way too much effort for dubious gain).
Closes: 826944
|
|
We usually use absolute paths to specify the location of dpkg, apt-key
and the like, but there is nothing wrong with using just the command
name and instead letting exec(3) do the lookup in PATH.
We had a wild mixture before, so opting for the more accepting option
of the two seems about right, especially as it makes no difference in
the default case as apt uses absolute paths.
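The difference boils down to which exec(3) flavour is used
(illustrative sketch):

    #include <unistd.h>

    int main()
    {
       char *const args[] = { const_cast<char *>("dpkg"),
                              const_cast<char *>("--version"), nullptr };
       // execv() needs a full path and ignores PATH entirely:
       //   execv("/usr/bin/dpkg", args);
       // execvp() accepts a bare command name and searches PATH for it,
       // which also works just fine when given an absolute path:
       execvp("dpkg", args);
       return 1;                  // only reached if the exec failed
    }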
|
|
Just closing the fd would be enough, but while we are at it we can also
use the Popen interface to have an easier time with this.
|
|
Not a big deal to leak fds in debugging mode, but for completeness.
Git-Dch: Ignore
|
|
|
|
Seen first in #826783, but as this buglog shows leaked uncompressed
files as well, we don't close it just yet.
|
|
This affects only compressors configured on the fly (rather than the
inbuilt ones, as they use a library).
|
|
The comment says it should have been removed with Lenny+1, which was a
while ago already, so it seems like a good time now…
And as this is a cleanup commit it also gets rid of spurious whitespace
at the end of lines, adds missing fold markers and does similar
busywork.
Git-Dch: Ignore
|
|
We had an old FIXME saying that it is probably pointless to do this if
we limit by length of the commandline already, and I completely agree.
The splitting is bad enough if it must be done, so we should only do it
if we have to (as in: absolute length of the commandline) and – but
that is just a remark – it is unlikely that we ever have/had a call
triggering this, as the default value was ~32000 items…
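For reference, splitting by accumulated commandline length looks
roughly like this (a generic sketch, not the apt code):

    #include <cstddef>
    #include <string>
    #include <vector>

    // Split a long argument list into chunks whose joined length (arguments
    // plus separating spaces) stays below the given byte limit.
    static std::vector<std::vector<std::string>>
    SplitByLength(std::vector<std::string> const &args, std::size_t maxBytes)
    {
       std::vector<std::vector<std::string>> chunks(1);
       std::size_t used = 0;
       for (auto const &arg : args)
       {
          if (used != 0 && used + arg.size() + 1 > maxBytes)
          {
             chunks.emplace_back();
             used = 0;
          }
          chunks.back().push_back(arg);
          used += arg.size() + 1;
       }
       return chunks;
    }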
|