|
pkgCacheFile's destructor unlocks the system, which is confusing
if you did not open the cachefile with WithLock set. Create a private
data instance that holds the value of WithLock.
This regression was introduced in commit b2e465d6d32d2dc884f58b94acb7e35f671a87fe:
Join with aliencode
Author: jgg
Date: 2001-02-20 07:03:16 GMT
Join with aliencode
by replacing a "Lock" member (which was only initialized when the lock
was actually taken) with calls to Lock and UnLock, where the latter also
took place even if the former never did.
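A minimal sketch of the fix, with hypothetical names (CacheFileSketch and
the SystemLock/SystemUnlock stubs are stand-ins, not apt's API): the lock
state lives in a private data instance, and the destructor only unlocks
if this instance actually took the lock.

  // Hypothetical stand-ins for the real locking primitives.
  static bool SystemLock() { return true; }
  static void SystemUnlock() {}

  class CacheFileSketch
  {
     // private data instance holding the WithLock value
     struct Private { bool WithLock = false; } *d = new Private();

  public:
     bool Open(bool const WithLock)
     {
        if (WithLock && SystemLock() == false)
           return false;
        d->WithLock = WithLock;   // remembered only if we really locked
        // … open/build the cache …
        return true;
     }

     ~CacheFileSketch()
     {
        if (d->WithLock)          // previously: unconditional unlock
           SystemUnlock();
        delete d;
     }
  };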
Regression-Of: b2e465d6d32d2dc884f58b94acb7e35f671a87fe
LP: #1794053
(cherry picked from commit e02297b8e22dae04872fe6fab6dba966de65dbba)
|
|
We used to fail on unreadable config/preferences/sources files; at least
for sources we didn't fail in the past, and it seems harsh to refuse to
work because of a single file, especially as the error messages are
inconsistent and end up being silly (like suggesting to run apt update
to fix the problem…).
LP: #1701852
|
|
This makes it easier to see which headers include what.
The changes were done by running
git grep -l '#\s*include' \
| grep -E '.(cc|h)$' \
| xargs sed -i -E 's/(^\s*)#(\s*)include/\1#\2 include/'
to modify all include lines by adding a space, and then running
./git-clang-format.sh.
|
|
The report mentions "apt list --upgradable", but there are others which
have inconsistent behavior ranging from segfaulting to doing something
with the partial (and hence incomplete) data. We had a recent report
about sources.list (#818628), this one mentions preferences, and the
obvious next step is conf files… so the testcase is adapted to check for
all three, in file and directory versions, and to run a bunch of
commands each time, all of which should have more or less the same
behavior in such a case (aka error out).
Closes: 824503
|
|
Otherwise, things will just start failing later down the stack,
because (a) the lazy getters do not check if building was successful
and (b) any further getter call would return the invalid object
anyway.
Also initialize VS in pkgCache to nullptr by default.
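A simplified, self-contained sketch of the pattern (made-up names, not
apt's real classes): the lazy getter refuses to hand out an object whose
construction failed, so callers see the error right away.

  #include <iostream>

  struct DepCacheSketch { /* … */ };

  class CacheFileSketch
  {
     DepCacheSketch *DCache = nullptr;

     bool BuildDepCache()
     {
        // pretend building can fail, e.g. because of a broken cache file
        return false;
     }

  public:
     DepCacheSketch *GetDepCache()
     {
        if (DCache == nullptr && BuildDepCache() == false)
           return nullptr;        // previously an invalid object leaked out
        return DCache;
     }
  };

  int main()
  {
     CacheFileSketch c;
     if (c.GetDepCache() == nullptr)
        std::cerr << "E: building the cache failed" << std::endl;
  }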
Closes: #818628
|
|
Regression introduced in a249b3e6fd798935a02b769149c9791a6fa6ef16, which
in the case of an invalid cache would build the first part unlocked and
later pick up the (still unlocked) cache for further processing, so the
system never got locked and apt would end up complaining about being
unable to release the lock at shutdown.
The far more common case of having a valid cache worked as expected and
hence covered up the problem – especially as the tests that would have
noticed it are simulations only, which do not lock.
Closes: 814139
Reported-By: Balint Reczey <balint@balintreczey.hu>
Reported-By: Helmut Grohne <helmut@subdivi.de> on IRC
|
|
It is a bit academic to support values which aren't big enough to fit even
the hashtables without resizing, but cleaning up ensures that we do the
right thing (aka not segfaulting) even if something goes wrong in these
deep layers. You still can't have very, very small values though…
Git-Dch: Ignore
|
|
build-dep was implemented by parsing the build dependencies of a package
and figuring out which packages to install/remove based on this. That
means that for the first level of dependencies build-dep was
implementing its very own resolver, with all the benefits (aka: bugs)
this gives us for not using the existing resolver for all levels.
Making this work involves generating a dummy binary package with fitting
Depends and Conflicts; as we can't create them out of thin air, the
cache generation needs to be involved, so we end up writing a Packages
file which we want to parse – after we have already parsed the other
Packages files. With .dsc/.deb files we could add them before we started
parsing anything.
With a bit of care we can avoid generating too much data we have to
throw away again (as many parts assume that e.g. the count of packages
doesn't change midair), so on the speed front there shouldn't be much of
a difference, but the output can be slightly confusing: if we have a
completely valid cache on disk, "Reading package lists... Done" is
printed twice – but apt is pretty quick about it in that case.
Closes: #137560, #444930, #489911, #583914, #728317, #812173
|
|
If we already have opened a cache, there is no point in having
to open it again.
|
|
Out of memory and similar circumstances could cause MMap::Map to fail,
and especially the mmap/malloc calls in it. With some additional
checking we can avoid segfaults and the like in such situations – at
least in theory, as if this is a real out-of-memory condition everything
we do to handle the error could just as well run into a memory problem
as well…
But at least in theory (if MMap::Map is made to fail always) we can deal
with it so well that a user never actually sees a failure (as the cache
it tries to load with it fails and is discarded, so that DynamicMMap
takes over and a new one is built) instead of segfaulting.
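A rough sketch of the kind of checking meant here, using plain
POSIX/stdlib calls rather than apt's MMap class (function names are made
up):

  #include <sys/mman.h>
  #include <cstddef>
  #include <cstdio>
  #include <cstdlib>

  void *MapCacheFile(int const fd, std::size_t const size)
  {
     void *base = mmap(nullptr, size, PROT_READ, MAP_PRIVATE, fd, 0);
     if (base == MAP_FAILED)      // easy to miss, segfaults much later otherwise
     {
        std::perror("mmap");
        return nullptr;           // caller discards this cache and rebuilds
     }
     return base;
  }

  void *AllocateCacheMemory(std::size_t const size)
  {
     void *base = std::malloc(size);
     if (base == nullptr)         // out of memory
     {
        std::fprintf(stderr, "E: out of memory while building the cache\n");
        return nullptr;
     }
     return base;
  }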
Closes: 803417
|
|
Unlinking /dev/null is bad, we shouldn't do that. Also, we should print
at least a warning if we tried to unlink a file but didn't manage to
pull it off (ignoring the cases where the file is /dev/null or doesn't
exist in the first place).
This got triggered by a piece of code in
pkgAcquire::Worker::PrepareFiles which is relatively unlikely to cause
problems: while handling temporary uncompressed files (which are set to
be kept compressed) it would figure out that two files are the same and
prepare for sharing by deleting them. Bad move.
That also shows why not printing a warning is a bad idea, as this hid
the error in non-root test runs.
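A hedged sketch of the intended behaviour (RemoveFileSketch is a
placeholder, not apt's real helper): never unlink /dev/null, treat a
missing file as success, and warn if the unlink genuinely fails.

  #include <cerrno>
  #include <cstdio>
  #include <cstring>
  #include <string>
  #include <unistd.h>

  bool RemoveFileSketch(std::string const &Function, std::string const &File)
  {
     if (File.empty() || File == "/dev/null")
        return true;              // nothing sensible to remove here
     if (unlink(File.c_str()) != 0 && errno != ENOENT)
     {
        std::fprintf(stderr, "W: %s: unlinking %s failed: %s\n",
                     Function.c_str(), File.c_str(), std::strerror(errno));
        return false;
     }
     return true;
  }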
Git-Dch: Ignore
|
|
Our error reporting has historically grown into some kind of mess.
A while ago I implemented stacking for the global error, which is used in
this commit now to wrap calls to functions which do not report (all)
errors via their return value, so that only failures in those calls cause
a failure to propagate down the chain rather than failing if anything
(potentially totally unrelated) has failed at some point in the past.
This way we can avoid stopping the entire acquire process just because a
single source produced an error, for example. It also means that after
the acquire process the cache is generated – even if the acquire
process had failures – as we still have the good old data around that we
can and should generate a cache from (again).
There are probably more instances of this hiding, but all these looked
like the easiest to work with and fix with reasonable (aka net-positive)
effects.
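Roughly the wrapping pattern described above, using apt's global error
stack (simplified; DoSomethingThatMayFail is a stand-in for a call that
reports problems via _error instead of its return value):

  #include <apt-pkg/error.h>

  // Stand-in for a call that reports problems via _error only.
  static void DoSomethingThatMayFail() {}

  bool RunStepSketch()
  {
     _error->PushToStack();                       // start from a clean error state
     DoSomethingThatMayFail();
     bool const Failed = _error->PendingError();  // only errors from this call
     _error->MergeWithStack();                    // keep the messages for display
     return Failed == false;                      // old, unrelated errors don't count
  }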
|
|
The parameter name suggests that it should forbid building the
entire cache in memory, but this isn't how it behaved previously, and as
AllowMem is false by default it actually prevents previous use cases from
working, like being root and configuring apt to build no caches at all.
This should be fixed at some point to actually work, but that is hard to
pull off as it means switching the default, and some callers (including
apt itself) actually did call it explicitly with false in certain
cases for no apparent reason (at least now, when it is common to have
enough memory to throw at every problem, and even if not, a slow apt is
usually better than an apt erroring out).
Closes: 796459
|
|
Currently, this always returns true, but it might start returning
false at some point in the future...
Gbp-Dch: ignore
|
|
Further abstracting our new ShowList allows us to use it for containers
of strings as well, giving us the option to implement an or-group
display for the Recommends and Suggests lists, which is a nice trick
given that it also helps with migrating the last remaining other cases
of the old ShowList.
|
|
Various small leaks here and there. Nothing particularly big, but still
good to fix. Found by the sanitizers while running our testcases.
Reported-By: gcc -fsanitize
Git-Dch: Ignore
|
|
This partly reverts d059cc2 and fixes bug #753297 in a more
general way by ensuring that CacheFile.BuildDepCache() builds
a pkgPolicy if there isn't one already.
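A simplified, self-contained sketch of that guard (made-up types, not
apt's real pkgPolicy/pkgDepCache; cleanup omitted for brevity):

  struct PolicySketch { /* pinning rules … */ };
  struct DepCacheSketch { explicit DepCacheSketch(PolicySketch const &) {} };

  class CacheFileSketch
  {
     PolicySketch *Plcy = nullptr;
     DepCacheSketch *DCache = nullptr;

     bool BuildPolicy()
     {
        if (Plcy == nullptr)
           Plcy = new PolicySketch();
        return Plcy != nullptr;
     }

  public:
     bool BuildDepCache()
     {
        if (DCache != nullptr)
           return true;
        if (BuildPolicy() == false)   // this guard is the missing piece
           return false;
        DCache = new DepCacheSketch(*Plcy);
        return true;
     }
  };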
|
|
Besides being a bit cleaner, it hopefully also resolves oddball problems
I have with high levels of parallel jobs.
Git-Dch: Ignore
Reported-By: iwyu (include-what-you-use)
|
|
Reported-By: gcc -Wunused-parameter
Git-Dch: Ignore
|
|
initialized in the constructor." messages (no functional change)
|
|
trying to get a list of included files from them
|
|
- clean up lost atomic cachefiles with 'clean' (Closes: #650513)
|
|
|
|
|
|
|
|
invalid in most cases anyway
|
|
by moving Policy to public again (and therefore after SrcList)
|
|
|
|
- split Open() into submethods to be able to build only parts
- make the OpProgress optional in the Cache build process (sketched below)
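A rough structural sketch of the result (hypothetical class, simplified
signatures; the real methods take more parameters):

  class OpProgress;   // progress reporter; a nullptr means "no progress output"

  class CacheFileSketch
  {
  public:
     // Stubbed submethods: the real ones build the individual pieces.
     bool BuildCaches(OpProgress * /*Progress*/ = nullptr) { return true; }
     bool BuildPolicy(OpProgress * /*Progress*/ = nullptr) { return true; }
     bool BuildDepCache(OpProgress * /*Progress*/ = nullptr) { return true; }

     // Open() becomes a convenience wrapper around the submethods, so
     // callers can also build only the parts they need.
     bool Open(OpProgress *Progress = nullptr)
     {
        return BuildCaches(Progress) &&
               BuildPolicy(Progress) &&
               BuildDepCache(Progress);
     }
  };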
|
|
Parts dirs are /etc/apt/{sources.list,apt.conf,preferences}.d
(in the default setup)
|
|
|
|
|
|
|
|
as it does not require a cachefile at all
|
|
|
|
for 'apt-get update'-like operations for the frontends; also provides
hooks to run stuff in APT::Update::{Pre,Post}-Invoke
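Roughly how those hook points tie into an update run (simplified sketch;
RunScripts reads the commands to execute from the named configuration
list, see the fileutl commit further down):

  #include <apt-pkg/fileutl.h>

  bool ListUpdateSketch()
  {
     if (RunScripts("APT::Update::Pre-Invoke") == false)
        return false;                         // a failing pre-hook aborts the update

     // … fetch and verify the index files (the actual 'update' work) …

     RunScripts("APT::Update::Post-Invoke");  // commands configured in apt.conf
     return true;
  }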
|
|
|
|
- move the RunScripts() code into fileutl.{cc,h}
* apt-pkg/cachefile.cc:
- add support for "APT::Update::{Pre,Post}-Invoke" scripts
|
|
- move the code that does the work from apt-get.cc to pkgCacheFile::ListUpdate()
|
|
Author: jgg
Date: 2002-04-27 04:28:04 GMT
'apt-get update' no longer does 'Building Dependency Tree'.
|
|
Author: jgg
Date: 2001-07-01 20:49:08 GMT
Various bug fixes
|
|
Author: jgg
Date: 2001-03-13 06:51:46 GMT
Alfredo's vendor stuff
|
|
Author: jgg
Date: 2001-02-20 07:03:16 GMT
Join with aliencode
|
|
Author: jgg
Date: 1999-06-24 04:06:30 GMT
Various minor bug fixes
|
|
Author: jgg
Date: 1999-05-04 20:09:48 GMT
Fixed a typo
|
|
Author: jgg
Date: 1999-04-19 02:35:38 GMT
Made apt-cache regenerate its cache in memory
|
|
Author: jgg
Date: 1999-04-18 06:36:36 GMT
Support for memory-only caching
|