bitbake.conf/package: Collapse PKGDATA_DIR into a single machine specific directory

Currently we have a hierarchy of pkgdata directories and the code has to put together
a search path and look through each in turn until it finds the data it needs.

This has led to a number of hardcoded paths and file globbing, which
is unpredictable and undesirable. Worse, certain tricks that should be
easy, like a GL-specific package architecture, become problematic with
the current search paths.
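
As a rough illustration, the lookup the old layout forces on callers looks
something like the sketch below; the helper and argument names here are
illustrative only, the real code is the all_pkgdatadirs()/get_subpkgedata_fn()
pair removed in the diff further down:

    import os

    def find_pkgdata_old(pkg, pkgdata_base, triplets):
        # Probe every <triplet>/runtime/ directory in turn until the
        # package's data file turns up; callers inherit the search order.
        for triplet in triplets:
            candidate = os.path.join(pkgdata_base, triplet, "runtime", pkg)
            if os.path.exists(candidate):
                return candidate
        return None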

With the modern sstate code, we can do better and construct a single pkgdata
directory for each machine in just the same way as we do for the sysroot. This
is already tried and well tested. With such a single directory, all the code that
iterated through multiple pkgdata directories can simply be removed, giving
a significant simplification of the code. Even existing build directories adapt
to the change well, since the package contents don't change, just the location
they're installed to and the stamp for them.
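
With a single machine-specific directory the lookup collapses to one path
expansion, which is exactly what get_subpkgedata_fn() becomes in the diff
below:

    def find_pkgdata_new(pkg, d):
        # No search path: PKGDATA_DIR already points at the one
        # machine-specific pkgdata directory.
        return d.expand('${PKGDATA_DIR}/runtime/%s' % pkg)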

The only complication is that we need a different shlibs directory for each
multilib. These are only used by package.bbclass and the simple fix is to
add MLPREFIX to the shlib directory name. This means the multilib packages will
repackage and the sstate checksum will change but an existing build directory
will adapt to the changes safely.
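
Roughly, the idea is that each multilib writes its shared-library data into
its own prefixed subdirectory of the single pkgdata tree. The sketch below is
illustrative only; the variable and function names are assumptions, not the
exact package.bbclass/bitbake.conf wording:

    import os

    def shlibs_dir(d):
        # e.g. .../pkgdata/<machine>/shlibs for the default libs and
        # .../pkgdata/<machine>/lib32-shlibs for a lib32 multilib.
        pkgdata_dir = d.getVar("PKGDATA_DIR", True)
        mlprefix = d.getVar("MLPREFIX", True) or ""
        return os.path.join(pkgdata_dir, mlprefix + "shlibs")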

It is close to release, however I believe the benefits this patch gives us
are worth consideration for inclusion, and it gives us more options for dealing
with problems like the GL one. It also lays the groundwork for shlibs
improvements in 1.6.

(From OE-Core rev: 1b8e4abd2d9c0901d38d89d0f944fe1ffd019379)

Signed-off-by: Richard Purdie <richard.purdie@linuxfoundation.org>
Richard Purdie
2013-09-13 13:35:31 +01:00
parent af5b3f3510
commit 8ebe7be3d9
14 changed files with 49 additions and 105 deletions


@@ -23,21 +23,7 @@ def read_pkgdatafile(fn):
     return pkgdata
 
-def all_pkgdatadirs(d):
-    dirs = []
-    triplets = (d.getVar("PKGMLTRIPLETS") or "").split()
-    for t in triplets:
-        dirs.append(t + "/runtime/")
-    return dirs
-
 def get_subpkgedata_fn(pkg, d):
-    dirs = all_pkgdatadirs(d)
-
-    pkgdata = d.expand('${TMPDIR}/pkgdata/')
-    for dir in dirs:
-        fn = pkgdata + dir + pkg
-        if os.path.exists(fn):
-            return fn
     return d.expand('${PKGDATA_DIR}/runtime/%s' % pkg)
 
 def has_subpkgdata(pkg, d):
@@ -70,29 +56,24 @@ def read_subpkgdata_dict(pkg, d):
 def _pkgmap(d):
     """Return a dictionary mapping package to recipe name."""
 
-    target_os = d.getVar("TARGET_OS", True)
-    target_vendor = d.getVar("TARGET_VENDOR", True)
-    basedir = os.path.dirname(d.getVar("PKGDATA_DIR", True))
-    dirs = ("%s%s-%s" % (arch, target_vendor, target_os)
-            for arch in d.getVar("PACKAGE_ARCHS", True).split())
+    pkgdatadir = d.getVar("PKGDATA_DIR", True)
 
     pkgmap = {}
-    for pkgdatadir in (os.path.join(basedir, sys) for sys in dirs):
-        try:
-            files = os.listdir(pkgdatadir)
-        except OSError:
-            bb.warn("No files in %s?" % pkgdatadir)
-            files = []
+    try:
+        files = os.listdir(pkgdatadir)
+    except OSError:
+        bb.warn("No files in %s?" % pkgdatadir)
+        files = []
 
-        for pn in filter(lambda f: not os.path.isdir(os.path.join(pkgdatadir, f)), files):
-            try:
-                pkgdata = read_pkgdatafile(os.path.join(pkgdatadir, pn))
-            except OSError:
-                continue
+    for pn in filter(lambda f: not os.path.isdir(os.path.join(pkgdatadir, f)), files):
+        try:
+            pkgdata = read_pkgdatafile(os.path.join(pkgdatadir, pn))
+        except OSError:
+            continue
 
-            packages = pkgdata.get("PACKAGES") or ""
-            for pkg in packages.split():
-                pkgmap[pkg] = pn
+        packages = pkgdata.get("PACKAGES") or ""
+        for pkg in packages.split():
+            pkgmap[pkg] = pn
 
     return pkgmap