Mirror of https://git.yoctoproject.org/poky
Synced 2026-02-01 14:28:44 +01:00

Compare commits
143 Commits
SHA1:

27af23e65f, f735e50c63, be3c73bc02, 0526e01ddf, a4266b454c, b1c27ead60, 504c22b9c9, 767b28ea55,
dfeea177d3, 48363d5a00, 15b49e25dc, 4cce3e4aba, 6c3cebfe6d, c7812938cb, 3f377fcc45, e90d014a0a,
e7ecb7e61e, 94f0bd12cd, 965d189933, 9235aec531, fb968f87a5, 5e68ca2ea4, b7e951a842, cc345014ba,
ed96f96db0, 26868e8050, dac3b30a2b, fae3759ad7, c9b61655e0, 0497b035a2, ba6aac3106, 9b2c586aad,
abbe518683, 4f6040ef2c, a8c43670d9, d13dfa3b2d, a1b04a126e, 96a00c1402, bdd3323254, e32d893c4c,
3df40af8d0, 058a9ff749, 5c8546cca0, 5c5b56cffc, 1fc37b75cb, e6bb30d96c, dbd5778d74, eed0d8765e,
8dcc289ee4, 4a847c8abd, 28b6628f41, 4792499fa5, 2e6a6f0598, fece3acfb9, a15b641c1c, f10615345e,
425e00fb99, 4a656cf222, 8419bb799e, 985a13277d, b771c50128, 1a6bf6e4af, 1e4028c5d3, f851202b2f,
cc7d4783e7, c9de24d3f4, 58a7160419, 767ced9fa5, 5ecc6d0d6f, c030e463ab, 709f570c82, cfbf6fad48,
650d20107d, a8a5765fed, e369448f32, d1fe084c03, 486441be18, 0eca2b4cb2, 7c08b602e6, 970d00de04,
f058e96728, 11fdbf2b27, bc02fb725b, 5d4b08853e, c29e8cbb2f, 784f93baf3, 755ca76f8e, 1e892fb5a0,
af811fbc0b, 0dc25d42ef, 5a816edcf9, f4434bd16e, e83c7d3056, 571259cc48, ad41126681, f2533f35e8,
dcd1428716, d81ab9f844, 13bf7c1299, 6d45c9f72d, 5ae465073f, 2de77b3c38, f6092be1de, babe0fa137,
5647682c2f, 77b92b01ce, 0a4f7521bb, c1261f843e, e3a3bdd81f, a8def2777c, 9884bc2d48, 2f1b47e416,
016d00123a, fdabda6345, 99dabeb2e9, f56a4774a9, 4a4cdae234, 7396cef1b9, 6676fb5e32, be11294d17,
2d93461815, 1e9d77c3b2, 3634379cea, 43c4cdb0df, 28c39928d3, 703eadc55f, e7134d50f3, ddef53b962,
c1392638ce, fbd21995ae, 9e21f5b114, b5ad96f86b, 33dcf6960b, 5979f64829, b0ac293871, 6915004b18,
90d45f4264, eb1782f715, 503023dd69, 1164f70c34, 31e19a34a5, 86c9aa8081, ef5298eebd
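The same range can be inspected from a local clone of the mirror. The sketch below assumes the first SHA1 in the list (27af23e65f) is the newest commit in the range and the last (ef5298eebd) is the oldest; that ordering is an assumption about how the compare view lists its rows, so swap the endpoints if the log comes back empty.

    # Sketch: list the commits and view part of the combined diff for this range locally.
    # The endpoint ordering (oldest..newest) is assumed, not taken from the listing above.
    git clone https://git.yoctoproject.org/poky && cd poky
    git log --oneline ef5298eebd..27af23e65f        # commits in the range, newest first
    git diff ef5298eebd 27af23e65f -- .gitignore    # the .gitignore portion of the compare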
.gitignore (vendored), 8 changes
@@ -6,18 +6,12 @@ pyshtables.py
pstage/
scripts/oe-git-proxy-socks
sources/
meta-*/
meta-*
!meta-skeleton
!meta-hob
hob-image-*.bb
*.swp
*.orig
*.rej
*~
!meta-yocto
!meta-yocto-bsp
bitbake/doc/manual/html/
bitbake/doc/manual/pdf/
bitbake/doc/manual/txt/
bitbake/doc/manual/xhtml/
pull-*/

@@ -1,34 +1,28 @@
Poky Hardware README
====================

This file gives details about using Poky with the reference machines
supported out of the box. A full list of supported reference target machines
can be found by looking in the following directories:

meta/conf/machine/
meta-yocto-bsp/conf/machine/

If you are in doubt about using Poky/OpenEmbedded with your hardware, consult
the documentation for your board/device.
This file gives details about using Poky with different hardware reference
boards and consumer devices. A full list of target machines can be found by
looking in the meta/conf/machine/ directory. If in doubt about using Poky with
your hardware, consult the documentation for your board/device.

Support for additional devices is normally added by creating BSP layers - for
more information please see the Yocto Board Support Package (BSP) Developer's
Guide - documentation source is in documentation/bspguide or download the PDF
from:

http://yoctoproject.org/documentation
http://yoctoproject.org/community/documentation

Support for physical reference hardware has now been split out into a
meta-yocto-bsp layer which can be removed separately from other layers if not
needed.
Support for machines other than QEMU may be moved out to separate BSP layers in
future versions.


QEMU Emulation Targets
======================

To simplify development, the build system supports building images to
work with the QEMU emulator in system emulation mode. Several architectures
are currently supported:
To simplify development Poky supports building images to work with the QEMU
emulator in system emulation mode. Several architectures are currently
supported:

* ARM (qemuarm)
* x86 (qemux86)
@@ -36,33 +30,32 @@ are currently supported:
* PowerPC (qemuppc)
* MIPS (qemumips)

Use of the QEMU images is covered in the Yocto Project Reference Manual.
The appropriate MACHINE variable value corresponding to the target is given
in brackets.
Use of the QEMU images is covered in the Poky Reference Manual. The Poky
MACHINE setting corresponding to the target is given in brackets.

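To make the MACHINE setting concrete, a minimal sketch of selecting a QEMU target and booting the result is shown below; the image name core-image-minimal and the oe-init-build-env/runqemu helpers are standard Poky usage assumed for illustration rather than anything taken from this change.

    # Sketch: build and boot a QEMU image (image name assumed; any MACHINE listed above works).
    source oe-init-build-env                          # create and enter a build directory
    echo 'MACHINE ?= "qemuarm"' >> conf/local.conf    # pick the value given in brackets
    bitbake core-image-minimal                        # build a small image for that MACHINE
    runqemu qemuarm                                   # boot it under QEMU system emulation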
Hardware Reference Boards
=========================

The following boards are supported by the meta-yocto-bsp layer:
The following boards are supported by Poky's core layer:

* Texas Instruments Beagleboard (beagleboard)
* Freescale MPC8315E-RDB (mpc8315e-rdb)
* Ubiquiti Networks RouterStation Pro (routerstationpro)

For more information see the board's section below. The appropriate MACHINE
variable value corresponding to the board is given in brackets.
For more information see the board's section below. The Poky MACHINE setting
corresponding to the board is given in brackets.


Consumer Devices
================

The following consumer devices are supported by the meta-yocto-bsp layer:
The following consumer devices are supported by Poky's core layer:

* Intel Atom based PCs and devices (atom-pc)

For more information see the device's section below. The appropriate MACHINE
variable value corresponding to the device is given in brackets.
For more information see the device's section below. The Poky MACHINE setting
corresponding to the device is given in brackets.


@@ -85,7 +78,7 @@ supports ethernet, wifi, sound, and i915 graphics by default in addition to
common PC input devices, busses, and so on.

Depending on the device, it can boot from a traditional hard-disk, a USB device,
or over the network. Writing generated images to physical media is
or over the network. Writing poky generated images to physical media is
straightforward with a caveat for USB devices. The following examples assume the
target boot device is /dev/sdb, be sure to verify this and use the correct
device as the following commands are run as root and are not reversable.

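As a concrete illustration of that warning, a minimal sketch of writing a built live image to the USB device follows; the image path under tmp/deploy/images is assumed for illustration and should be replaced with the actual file produced by the build.

    # Sketch: write a live image to the USB stick as root (the path and /dev/sdb are examples;
    # double-check the device node first, since dd cannot be undone).
    dd if=tmp/deploy/images/core-image-minimal-atom-pc.hddimg of=/dev/sdb bs=4M
    sync    # make sure all data is flushed before unplugging the device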
@@ -138,7 +131,7 @@ USB Device:
      device stops flashing, remove and reinsert the device to allow the
      kernel to detect the new partition layout.

   c. Copy the contents of the image to the USB-ZIP mode device:
   c. Copy the contents of the poky image to the USB-ZIP mode device:

      # mkdir /tmp/image
      # mkdir /tmp/usbkey
@@ -288,8 +281,8 @@ anything here.
Load the kernel and dtb (device tree blob), and boot the system as follows:

 1. Get the kernel (uImage-mpc8315e-rdb.bin) and dtb (uImage-mpc8315e-rdb.dtb)
    files from the tmp/deploy directory, and make them available on your TFTP
    server.
    files from the Poky build tmp/deploy directory, and make them available on
    your TFTP server.

 2. Connect the board's first serial port to your workstation and then start up
    your favourite serial terminal so that you will be able to interact with
@@ -308,9 +301,9 @@ Load the kernel and dtb (device tree blob), and boot the system as follows:

 5. Download the kernel and dtb, and boot:

    => tftp 1000000 uImage-mpc8315e-rdb.bin
    => tftp 2000000 uImage-mpc8315e-rdb.dtb
    => bootm 1000000 - 2000000
    => tftp 800000 uImage-mpc8315e-rdb.bin
    => tftp 780000 uImage-mpc8315e-rdb.dtb
    => bootm 800000 - 780000


Ubiquiti Networks RouterStation Pro (routerstationpro)

@@ -40,7 +40,7 @@ from bb import cooker
|
||||
from bb import ui
|
||||
from bb import server
|
||||
|
||||
__version__ = "1.18.0"
|
||||
__version__ = "1.16.0"
|
||||
logger = logging.getLogger("BitBake")
|
||||
|
||||
# Unbuffer stdout to avoid log truncation in the event
|
||||
@@ -214,10 +214,9 @@ Default BBFILES are the .bb files in the current directory.""")
|
||||
if configuration.bind and configuration.servertype != "xmlrpc":
|
||||
sys.exit("FATAL: If '-B' or '--bind' is defined, we must set the servertype as 'xmlrpc'.\n")
|
||||
|
||||
if "BBDEBUG" in os.environ:
|
||||
level = int(os.environ["BBDEBUG"])
|
||||
if level > configuration.debug:
|
||||
configuration.debug = level
|
||||
# Save a logfile for cooker into the current working directory. When the
|
||||
# server is daemonized this logfile will be truncated.
|
||||
cooker_logfile = os.path.join(os.getcwd(), "cooker.log")
|
||||
|
||||
bb.msg.init_msgconfig(configuration.verbose, configuration.debug,
|
||||
configuration.debug_domains)
|
||||
@@ -229,8 +228,10 @@ Default BBFILES are the .bb files in the current directory.""")
|
||||
# Before we start modifying the environment we should take a pristine
|
||||
# copy for possible later use
|
||||
initialenv = os.environ.copy()
|
||||
# Clear away any spurious environment variables while we stoke up the cooker
|
||||
cleanedvars = bb.utils.clean_environment()
|
||||
# Clear away any spurious environment variables. But don't wipe the
|
||||
# environment totally. This is necessary to ensure the correct operation
|
||||
# of the UIs (e.g. for DISPLAY, etc.)
|
||||
bb.utils.clean_environment()
|
||||
|
||||
server = server.BitBakeServer()
|
||||
if configuration.bind:
|
||||
@@ -245,7 +246,7 @@ Default BBFILES are the .bb files in the current directory.""")
|
||||
|
||||
server.addcooker(cooker)
|
||||
server.saveConnectionDetails()
|
||||
server.detach()
|
||||
server.detach(cooker_logfile)
|
||||
|
||||
# Should no longer need to ever reference cooker
|
||||
del cooker
|
||||
@@ -256,10 +257,6 @@ Default BBFILES are the .bb files in the current directory.""")
|
||||
# Setup a connection to the server (cooker)
|
||||
server_connection = server.establishConnection()
|
||||
|
||||
# Restore the environment in case the UI needs it
|
||||
for k in cleanedvars:
|
||||
os.environ[k] = cleanedvars[k]
|
||||
|
||||
try:
|
||||
return server.launchUI(ui_main, server_connection.connection, server_connection.events)
|
||||
finally:
|
||||
|
||||
@@ -85,8 +85,8 @@ options, args = parser.parse_args(sys.argv)
|
||||
if len(args) == 1:
|
||||
parser.print_help()
|
||||
else:
|
||||
tinfoil = bb.tinfoil.Tinfoil()
|
||||
if options.taskmode:
|
||||
tinfoil = bb.tinfoil.Tinfoil()
|
||||
if len(args) < 3:
|
||||
logger.error("Please specify a recipe and task name")
|
||||
sys.exit(1)
|
||||
|
||||
@@ -26,7 +26,6 @@ import os
|
||||
import sys
|
||||
import fnmatch
|
||||
from collections import defaultdict
|
||||
import re
|
||||
|
||||
bindir = os.path.dirname(__file__)
|
||||
topdir = os.path.dirname(bindir)
|
||||
@@ -73,7 +72,7 @@ class Commands(cmd.Cmd):
|
||||
else:
|
||||
sys.stdout.write("usage: bitbake-layers <command> [arguments]\n\n")
|
||||
sys.stdout.write("Available commands:\n")
|
||||
procnames = list(set(self.get_names()))
|
||||
procnames = self.get_names()
|
||||
for procname in procnames:
|
||||
if procname[:3] == 'do_':
|
||||
sys.stdout.write(" %s\n" % procname[3:].replace('_', '-'))
|
||||
@@ -459,25 +458,10 @@ build results (as the layer priority order has effectively changed).
|
||||
for layer, _, regex, _ in self.bbhandler.cooker.status.bbfile_config_priorities:
|
||||
if regex.match(filename):
|
||||
for layerdir in self.bblayers:
|
||||
if regex.match(os.path.join(layerdir, 'test')) and re.match(layerdir, filename):
|
||||
if regex.match(os.path.join(layerdir, 'test')):
|
||||
return self.get_layer_name(layerdir)
|
||||
return "?"
|
||||
|
||||
def get_file_layerdir(self, filename):
|
||||
for layer, _, regex, _ in self.bbhandler.cooker.status.bbfile_config_priorities:
|
||||
if regex.match(filename):
|
||||
for layerdir in self.bblayers:
|
||||
if regex.match(os.path.join(layerdir, 'test')) and re.match(layerdir, filename):
|
||||
return layerdir
|
||||
return "?"
|
||||
|
||||
def remove_layer_prefix(self, f):
|
||||
"""Remove the layer_dir prefix, e.g., f = /path/to/layer_dir/foo/blah, the
|
||||
return value will be: layer_dir/foo/blah"""
|
||||
f_layerdir = self.get_file_layerdir(f)
|
||||
prefix = os.path.join(os.path.dirname(f_layerdir), '')
|
||||
return f[len(prefix):] if f.startswith(prefix) else f
|
||||
|
||||
def get_layer_name(self, layerdir):
|
||||
return os.path.basename(layerdir.rstrip(os.sep))
|
||||
|
||||
@@ -557,164 +541,6 @@ Recipes are listed with the bbappends that apply to them as subitems.
|
||||
notappended.append(basename)
|
||||
return appended, notappended
|
||||
|
||||
def do_show_cross_depends(self, args):
|
||||
"""figure out the dependency between recipes that crosses a layer boundary.
|
||||
|
||||
usage: show-cross-depends [-f]
|
||||
|
||||
Figure out the dependency between recipes that crosses a layer boundary.
|
||||
|
||||
Options:
|
||||
-f show full file path
|
||||
|
||||
NOTE:
|
||||
The .bbappend file can impact the dependency.
|
||||
"""
|
||||
self.bbhandler.prepare()
|
||||
|
||||
show_filenames = False
|
||||
for arg in args.split():
|
||||
if arg == '-f':
|
||||
show_filenames = True
|
||||
else:
|
||||
sys.stderr.write("show-cross-depends: invalid option %s\n" % arg)
|
||||
self.do_help('')
|
||||
return
|
||||
|
||||
pkg_fn = self.bbhandler.cooker_data.pkg_fn
|
||||
bbpath = str(self.bbhandler.config_data.getVar('BBPATH', True))
|
||||
self.require_re = re.compile(r"require\s+(.+)")
|
||||
self.include_re = re.compile(r"include\s+(.+)")
|
||||
self.inherit_re = re.compile(r"inherit\s+(.+)")
|
||||
|
||||
# The bb's DEPENDS and RDEPENDS
|
||||
for f in pkg_fn:
|
||||
f = bb.cache.Cache.virtualfn2realfn(f)[0]
|
||||
# Get the layername that the file is in
|
||||
layername = self.get_file_layer(f)
|
||||
|
||||
# The DEPENDS
|
||||
deps = self.bbhandler.cooker_data.deps[f]
|
||||
for pn in deps:
|
||||
if pn in self.bbhandler.cooker_data.pkg_pn:
|
||||
best = bb.providers.findBestProvider(pn,
|
||||
self.bbhandler.cooker.configuration.data,
|
||||
self.bbhandler.cooker_data,
|
||||
self.bbhandler.cooker_data.pkg_pn)
|
||||
self.check_cross_depends("DEPENDS", layername, f, best[3], show_filenames)
|
||||
|
||||
# The RDPENDS
|
||||
all_rdeps = self.bbhandler.cooker_data.rundeps[f].values()
|
||||
# Remove the duplicated or null one.
|
||||
sorted_rdeps = {}
|
||||
# The all_rdeps is the list in list, so we need two for loops
|
||||
for k1 in all_rdeps:
|
||||
for k2 in k1:
|
||||
sorted_rdeps[k2] = 1
|
||||
all_rdeps = sorted_rdeps.keys()
|
||||
for rdep in all_rdeps:
|
||||
all_p = bb.providers.getRuntimeProviders(self.bbhandler.cooker_data, rdep)
|
||||
if all_p:
|
||||
best = bb.providers.filterProvidersRunTime(all_p, rdep,
|
||||
self.bbhandler.cooker.configuration.data,
|
||||
self.bbhandler.cooker_data)[0][0]
|
||||
self.check_cross_depends("RDEPENDS", layername, f, best, show_filenames)
|
||||
|
||||
# The inherit class
|
||||
cls_re = re.compile('classes/')
|
||||
if f in self.bbhandler.cooker_data.inherits:
|
||||
inherits = self.bbhandler.cooker_data.inherits[f]
|
||||
for cls in inherits:
|
||||
# The inherits' format is [classes/cls, /path/to/classes/cls]
|
||||
# ignore the classes/cls.
|
||||
if not cls_re.match(cls):
|
||||
inherit_layername = self.get_file_layer(cls)
|
||||
if inherit_layername != layername:
|
||||
if not show_filenames:
|
||||
f_short = self.remove_layer_prefix(f)
|
||||
cls = self.remove_layer_prefix(cls)
|
||||
else:
|
||||
f_short = f
|
||||
logger.plain("%s inherits %s" % (f_short, cls))
|
||||
|
||||
# The 'require/include xxx' in the bb file
|
||||
pv_re = re.compile(r"\${PV}")
|
||||
fnfile = open(f, 'r')
|
||||
line = fnfile.readline()
|
||||
while line:
|
||||
m, keyword = self.match_require_include(line)
|
||||
# Found the 'require/include xxxx'
|
||||
if m:
|
||||
needed_file = m.group(1)
|
||||
# Replace the ${PV} with the real PV
|
||||
if pv_re.search(needed_file) and f in self.bbhandler.cooker_data.pkg_pepvpr:
|
||||
pv = self.bbhandler.cooker_data.pkg_pepvpr[f][1]
|
||||
needed_file = re.sub(r"\${PV}", pv, needed_file)
|
||||
self.print_cross_files(bbpath, keyword, layername, f, needed_file, show_filenames)
|
||||
line = fnfile.readline()
|
||||
fnfile.close()
|
||||
|
||||
# The "require/include xxx" in conf/machine/*.conf, .inc and .bbclass
|
||||
conf_re = re.compile(".*/conf/machine/[^\/]*\.conf$")
|
||||
inc_re = re.compile(".*\.inc$")
|
||||
# The "inherit xxx" in .bbclass
|
||||
bbclass_re = re.compile(".*\.bbclass$")
|
||||
for layerdir in self.bblayers:
|
||||
layername = self.get_layer_name(layerdir)
|
||||
for dirpath, dirnames, filenames in os.walk(layerdir):
|
||||
for name in filenames:
|
||||
f = os.path.join(dirpath, name)
|
||||
s = conf_re.match(f) or inc_re.match(f) or bbclass_re.match(f)
|
||||
if s:
|
||||
ffile = open(f, 'r')
|
||||
line = ffile.readline()
|
||||
while line:
|
||||
m, keyword = self.match_require_include(line)
|
||||
# Only bbclass has the "inherit xxx" here.
|
||||
bbclass=""
|
||||
if not m and f.endswith(".bbclass"):
|
||||
m, keyword = self.match_inherit(line)
|
||||
bbclass=".bbclass"
|
||||
# Find a 'require/include xxxx'
|
||||
if m:
|
||||
self.print_cross_files(bbpath, keyword, layername, f, m.group(1) + bbclass, show_filenames)
|
||||
line = ffile.readline()
|
||||
ffile.close()
|
||||
|
||||
def print_cross_files(self, bbpath, keyword, layername, f, needed_filename, show_filenames):
|
||||
"""Print the depends that crosses a layer boundary"""
|
||||
needed_file = bb.utils.which(bbpath, needed_filename)
|
||||
if needed_file:
|
||||
# Which layer is this file from
|
||||
needed_layername = self.get_file_layer(needed_file)
|
||||
if needed_layername != layername:
|
||||
if not show_filenames:
|
||||
f = self.remove_layer_prefix(f)
|
||||
needed_file = self.remove_layer_prefix(needed_file)
|
||||
logger.plain("%s %s %s" %(f, keyword, needed_file))
|
||||
def match_inherit(self, line):
|
||||
"""Match the inherit xxx line"""
|
||||
return (self.inherit_re.match(line), "inherits")
|
||||
|
||||
def match_require_include(self, line):
|
||||
"""Match the require/include xxx line"""
|
||||
m = self.require_re.match(line)
|
||||
keyword = "requires"
|
||||
if not m:
|
||||
m = self.include_re.match(line)
|
||||
keyword = "includes"
|
||||
return (m, keyword)
|
||||
|
||||
def check_cross_depends(self, keyword, layername, f, needed_file, show_filenames):
|
||||
"""Print the DEPENDS/RDEPENDS file that crosses a layer boundary"""
|
||||
best_realfn = bb.cache.Cache.virtualfn2realfn(needed_file)[0]
|
||||
needed_layername = self.get_file_layer(best_realfn)
|
||||
if needed_layername != layername:
|
||||
if not show_filenames:
|
||||
f = self.remove_layer_prefix(f)
|
||||
best_realfn = self.remove_layer_prefix(best_realfn)
|
||||
|
||||
logger.plain("%s %s %s" % (f, keyword, best_realfn))
|
||||
|
||||
if __name__ == '__main__':
|
||||
sys.exit(main(sys.argv[1:]) or 0)
|
||||
|
||||
@@ -28,10 +28,8 @@ import gtk
|
||||
import optparse
|
||||
import pygtk
|
||||
|
||||
from bb.ui.crumbs.hig import DeployImageDialog, ImageSelectionDialog, CrumbsMessageDialog
|
||||
from bb.ui.crumbs.hobwidget import HobAltButton, HobButton
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
from bb.ui.crumbs.hig.deployimagedialog import DeployImageDialog
|
||||
from bb.ui.crumbs.hig.imageselectiondialog import ImageSelectionDialog
|
||||
|
||||
# I put all the fs bitbake supported here. Need more test.
|
||||
DEPLOYABLE_IMAGE_TYPES = ["jffs2", "cramfs", "ext2", "ext3", "btrfs", "squashfs", "ubi", "vmdk"]
|
||||
|
||||
@@ -10,8 +10,8 @@ if &compatible || version < 600
|
||||
finish
|
||||
endif
|
||||
|
||||
" .bb, .bbappend and .bbclass
|
||||
au BufNewFile,BufRead *.{bb,bbappend,bbclass} set filetype=bitbake
|
||||
" .bb and .bbclass
|
||||
au BufNewFile,BufRead *.b{b,bclass} set filetype=bitbake
|
||||
|
||||
" .inc
|
||||
au BufNewFile,BufRead *.inc set filetype=bitbake
|
||||
|
||||
@@ -103,24 +103,6 @@ Show debug logging for the specified logging domains
|
||||
.TP
|
||||
.B \-P, \-\-profile
|
||||
profile the command and print a report
|
||||
.TP
|
||||
.B \-uUI, \-\-ui=UI
|
||||
User interface to use. Currently, hob, depexp, goggle or ncurses can be specified as UI.
|
||||
.TP
|
||||
.B \-tSERVERTYPE, \-\-servertype=SERVERTYPE
|
||||
Choose which server to use, none, process or xmlrpc.
|
||||
.TP
|
||||
.B \-\-revisions-changed
|
||||
Set the exit code depending on whether upstream floating revisions have changed or not.
|
||||
.TP
|
||||
.B \-\-server-only
|
||||
Run bitbake without UI, the frontend can connect with bitbake server itself.
|
||||
.TP
|
||||
.B \-BBIND, \-\-bind=BIND
|
||||
The name/address for the bitbake server to bind to.
|
||||
.TP
|
||||
.B \-\-no\-setscene
|
||||
Do not run any setscene tasks, forces builds.
|
||||
|
||||
.SH ENVIRONMENT VARIABLES
|
||||
bitbake uses the following environment variables to control its
|
||||
|
||||
@@ -21,7 +21,7 @@
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
__version__ = "1.18.0"
|
||||
__version__ = "1.16.0"
|
||||
|
||||
import sys
|
||||
if sys.version_info < (2, 6, 0):
|
||||
@@ -74,6 +74,11 @@ logger.setLevel(logging.DEBUG - 2)
|
||||
# can result in construction of the various loggers.
|
||||
import bb.msg
|
||||
|
||||
if "BBDEBUG" in os.environ:
|
||||
level = int(os.environ["BBDEBUG"])
|
||||
if level:
|
||||
bb.msg.set_debug_level(level)
|
||||
|
||||
from bb import fetch2 as fetch
|
||||
sys.modules['bb.fetch'] = sys.modules['bb.fetch2']
|
||||
|
||||
|
||||
@@ -29,7 +29,6 @@ import os
|
||||
import sys
|
||||
import logging
|
||||
import shlex
|
||||
import glob
|
||||
import bb
|
||||
import bb.msg
|
||||
import bb.process
|
||||
@@ -423,27 +422,13 @@ def _exec_task(fn, task, d, quieterr):
|
||||
|
||||
return 0
|
||||
|
||||
def exec_task(fn, task, d, profile = False):
|
||||
def exec_task(fn, task, d):
|
||||
try:
|
||||
quieterr = False
|
||||
if d.getVarFlag(task, "quieterrors") is not None:
|
||||
quieterr = True
|
||||
|
||||
if profile:
|
||||
profname = "profile-%s.log" % (os.path.basename(fn) + "-" + task)
|
||||
try:
|
||||
import cProfile as profile
|
||||
except:
|
||||
import profile
|
||||
prof = profile.Profile()
|
||||
ret = profile.Profile.runcall(prof, _exec_task, fn, task, d, quieterr)
|
||||
prof.dump_stats(profname)
|
||||
bb.utils.process_profilelog(profname)
|
||||
|
||||
return ret
|
||||
else:
|
||||
return _exec_task(fn, task, d, quieterr)
|
||||
|
||||
return _exec_task(fn, task, d, quieterr)
|
||||
except Exception:
|
||||
from traceback import format_exc
|
||||
if not quieterr:
|
||||
@@ -506,11 +491,9 @@ def stamp_cleanmask_internal(taskname, d, file_name):
|
||||
extrainfo = d.getVarFlag(taskflagname, 'stamp-extra-info', True) or ""
|
||||
|
||||
if not stamp:
|
||||
return []
|
||||
return
|
||||
|
||||
cleanmask = bb.parse.siggen.stampcleanmask(stamp, file_name, taskname, extrainfo)
|
||||
|
||||
return [cleanmask, cleanmask.replace(taskflagname, taskflagname + "_setscene")]
|
||||
return bb.parse.siggen.stampcleanmask(stamp, file_name, taskname, extrainfo)
|
||||
|
||||
def make_stamp(task, d, file_name = None):
|
||||
"""
|
||||
@@ -518,16 +501,9 @@ def make_stamp(task, d, file_name = None):
|
||||
(d can be a data dict or dataCache)
|
||||
"""
|
||||
cleanmask = stamp_cleanmask_internal(task, d, file_name)
|
||||
for mask in cleanmask:
|
||||
for name in glob.glob(mask):
|
||||
# Preserve sigdata files in the stamps directory
|
||||
if "sigdata" in name:
|
||||
continue
|
||||
# Preserve taint files in the stamps directory
|
||||
if name.endswith('.taint'):
|
||||
continue
|
||||
os.unlink(name)
|
||||
|
||||
if cleanmask:
|
||||
bb.utils.remove(cleanmask)
|
||||
|
||||
stamp = stamp_internal(task, d, file_name)
|
||||
# Remove the file and recreate to force timestamp
|
||||
# change on broken NFS filesystems
|
||||
@@ -608,10 +584,9 @@ def add_tasks(tasklist, d):
|
||||
getTask('noexec')
|
||||
getTask('umask')
|
||||
task_deps['parents'][task] = []
|
||||
if 'deps' in flags:
|
||||
for dep in flags['deps']:
|
||||
dep = data.expand(dep, d)
|
||||
task_deps['parents'][task].append(dep)
|
||||
for dep in flags['deps']:
|
||||
dep = data.expand(dep, d)
|
||||
task_deps['parents'][task].append(dep)
|
||||
|
||||
# don't assume holding a reference
|
||||
data.setVar('_task_deps', task_deps, d)
|
||||
|
||||
@@ -119,6 +119,7 @@ class CoreRecipeInfo(RecipeInfoCommon):
|
||||
self.basetaskhashes = self.taskvar('BB_BASEHASH', self.tasks, metadata)
|
||||
self.hashfilename = self.getvar('BB_HASHFILENAME', metadata)
|
||||
|
||||
self.file_depends = metadata.getVar('__depends', False)
|
||||
self.task_deps = metadata.getVar('_task_deps', False) or {'tasks': [], 'parents': {}}
|
||||
|
||||
self.skipped = False
|
||||
@@ -126,6 +127,7 @@ class CoreRecipeInfo(RecipeInfoCommon):
|
||||
self.pv = self.getvar('PV', metadata)
|
||||
self.pr = self.getvar('PR', metadata)
|
||||
self.defaultpref = self.intvar('DEFAULT_PREFERENCE', metadata)
|
||||
self.broken = self.getvar('BROKEN', metadata)
|
||||
self.not_world = self.getvar('EXCLUDE_FROM_WORLD', metadata)
|
||||
self.stamp = self.getvar('STAMP', metadata)
|
||||
self.stampclean = self.getvar('STAMPCLEAN', metadata)
|
||||
@@ -232,7 +234,7 @@ class CoreRecipeInfo(RecipeInfoCommon):
|
||||
|
||||
# Collect files we may need for possible world-dep
|
||||
# calculations
|
||||
if not self.not_world:
|
||||
if not self.broken and not self.not_world:
|
||||
cachedata.possible_world.append(fn)
|
||||
|
||||
# create a collection of all targets for sanity checking
|
||||
@@ -403,12 +405,12 @@ class Cache(object):
|
||||
"""Parse the specified filename, returning the recipe information"""
|
||||
infos = []
|
||||
datastores = cls.load_bbfile(filename, appends, configdata)
|
||||
depends = []
|
||||
depends = set()
|
||||
for variant, data in sorted(datastores.iteritems(),
|
||||
key=lambda i: i[0],
|
||||
reverse=True):
|
||||
virtualfn = cls.realfn2virtual(filename, variant)
|
||||
depends = depends + (data.getVar("__depends", False) or [])
|
||||
depends |= (data.getVar("__depends", False) or set())
|
||||
if depends and not variant:
|
||||
data.setVar("__depends", depends)
|
||||
|
||||
@@ -526,7 +528,7 @@ class Cache(object):
|
||||
|
||||
if appends != info_array[0].appends:
|
||||
logger.debug(2, "Cache: appends for %s changed", fn)
|
||||
logger.debug(2, "%s to %s" % (str(appends), str(info_array[0].appends)))
|
||||
bb.note("%s to %s" % (str(appends), str(info_array[0].appends)))
|
||||
self.remove(fn)
|
||||
return False
|
||||
|
||||
|
||||
@@ -41,10 +41,6 @@ class HobRecipeInfo(RecipeInfoCommon):
|
||||
self.license = self.getvar('LICENSE', metadata)
|
||||
self.section = self.getvar('SECTION', metadata)
|
||||
self.description = self.getvar('DESCRIPTION', metadata)
|
||||
self.homepage = self.getvar('HOMEPAGE', metadata)
|
||||
self.bugtracker = self.getvar('BUGTRACKER', metadata)
|
||||
self.prevision = self.getvar('PR', metadata)
|
||||
self.files_info = self.getvar('FILES_INFO', metadata)
|
||||
|
||||
@classmethod
|
||||
def init_cacheData(cls, cachedata):
|
||||
@@ -53,17 +49,9 @@ class HobRecipeInfo(RecipeInfoCommon):
|
||||
cachedata.license = {}
|
||||
cachedata.section = {}
|
||||
cachedata.description = {}
|
||||
cachedata.homepage = {}
|
||||
cachedata.bugtracker = {}
|
||||
cachedata.prevision = {}
|
||||
cachedata.files_info = {}
|
||||
|
||||
def add_cacheData(self, cachedata, fn):
|
||||
cachedata.summary[fn] = self.summary
|
||||
cachedata.license[fn] = self.license
|
||||
cachedata.section[fn] = self.section
|
||||
cachedata.description[fn] = self.description
|
||||
cachedata.homepage[fn] = self.homepage
|
||||
cachedata.bugtracker[fn] = self.bugtracker
|
||||
cachedata.prevision[fn] = self.prevision
|
||||
cachedata.files_info[fn] = self.files_info
|
||||
|
||||
@@ -35,7 +35,7 @@ def check_indent(codestr):
|
||||
|
||||
class CodeParserCache(MultiProcessCache):
|
||||
cache_file_name = "bb_codeparser.dat"
|
||||
CACHE_VERSION = 3
|
||||
CACHE_VERSION = 2
|
||||
|
||||
def __init__(self):
|
||||
MultiProcessCache.__init__(self)
|
||||
@@ -100,8 +100,7 @@ class BufferedLogger(Logger):
|
||||
self.buffer = []
|
||||
|
||||
class PythonParser():
|
||||
getvars = ("d.getVar", "bb.data.getVar", "data.getVar", "d.appendVar", "d.prependVar")
|
||||
containsfuncs = ("bb.utils.contains", "base_contains", "oe.utils.contains")
|
||||
getvars = ("d.getVar", "bb.data.getVar", "data.getVar")
|
||||
execfuncs = ("bb.build.exec_func", "bb.build.exec_task")
|
||||
|
||||
def warn(self, func, arg):
|
||||
@@ -120,7 +119,7 @@ class PythonParser():
|
||||
|
||||
def visit_Call(self, node):
|
||||
name = self.called_node_name(node.func)
|
||||
if name in self.getvars or name in self.containsfuncs:
|
||||
if name in self.getvars:
|
||||
if isinstance(node.args[0], ast.Str):
|
||||
self.var_references.add(node.args[0].s)
|
||||
else:
|
||||
|
||||
@@ -44,9 +44,6 @@ class CommandFailed(CommandExit):
|
||||
self.error = message
|
||||
CommandExit.__init__(self, 1)
|
||||
|
||||
class CommandError(Exception):
|
||||
pass
|
||||
|
||||
class Command:
|
||||
"""
|
||||
A queue of asynchronous commands for bitbake
|
||||
@@ -60,26 +57,21 @@ class Command:
|
||||
self.currentAsyncCommand = None
|
||||
|
||||
def runCommand(self, commandline):
|
||||
command = commandline.pop(0)
|
||||
if hasattr(CommandsSync, command):
|
||||
# Can run synchronous commands straight away
|
||||
command_method = getattr(self.cmds_sync, command)
|
||||
try:
|
||||
result = command_method(self, commandline)
|
||||
except CommandError as exc:
|
||||
return None, exc.args[0]
|
||||
except Exception:
|
||||
import traceback
|
||||
return None, traceback.format_exc()
|
||||
else:
|
||||
return result, None
|
||||
if self.currentAsyncCommand is not None:
|
||||
return None, "Busy (%s in progress)" % self.currentAsyncCommand[0]
|
||||
if command not in CommandsAsync.__dict__:
|
||||
return None, "No such command"
|
||||
self.currentAsyncCommand = (command, commandline)
|
||||
self.cooker.server_registration_cb(self.cooker.runCommands, self.cooker)
|
||||
return True, None
|
||||
try:
|
||||
command = commandline.pop(0)
|
||||
if command in CommandsSync.__dict__:
|
||||
# Can run synchronous commands straight away
|
||||
return getattr(CommandsSync, command)(self.cmds_sync, self, commandline)
|
||||
if self.currentAsyncCommand is not None:
|
||||
return "Busy (%s in progress)" % self.currentAsyncCommand[0]
|
||||
if command not in CommandsAsync.__dict__:
|
||||
return "No such command"
|
||||
self.currentAsyncCommand = (command, commandline)
|
||||
self.cooker.server_registration_cb(self.cooker.runCommands, self.cooker)
|
||||
return True
|
||||
except:
|
||||
import traceback
|
||||
return traceback.format_exc()
|
||||
|
||||
def runAsyncCommand(self):
|
||||
try:
|
||||
@@ -147,13 +139,7 @@ class CommandsSync:
|
||||
"""
|
||||
Get any command parsed from the commandline
|
||||
"""
|
||||
cmd_action = command.cooker.commandlineAction
|
||||
if cmd_action is None:
|
||||
return None
|
||||
elif 'msg' in cmd_action and cmd_action['msg']:
|
||||
raise CommandError(cmd_action['msg'])
|
||||
else:
|
||||
return cmd_action['action']
|
||||
return command.cooker.commandlineAction
|
||||
|
||||
def getVariable(self, command, params):
|
||||
"""
|
||||
@@ -174,18 +160,6 @@ class CommandsSync:
|
||||
value = str(params[1])
|
||||
command.cooker.configuration.data.setVar(varname, value)
|
||||
|
||||
def enableDataTracking(self, command, params):
|
||||
"""
|
||||
Enable history tracking for variables
|
||||
"""
|
||||
command.cooker.enableDataTracking()
|
||||
|
||||
def disableDataTracking(self, command, params):
|
||||
"""
|
||||
Disable history tracking for variables
|
||||
"""
|
||||
command.cooker.disableDataTracking()
|
||||
|
||||
def initCooker(self, command, params):
|
||||
"""
|
||||
Init the cooker to initial state with nothing parsed
|
||||
@@ -205,21 +179,12 @@ class CommandsSync:
|
||||
"""
|
||||
return bb.utils.cpu_count()
|
||||
|
||||
def matchFile(self, command, params):
|
||||
fMatch = params[0]
|
||||
return command.cooker.matchFile(fMatch)
|
||||
|
||||
def generateNewImage(self, command, params):
|
||||
image = params[0]
|
||||
base_image = params[1]
|
||||
package_queue = params[2]
|
||||
return command.cooker.generateNewImage(image, base_image, package_queue)
|
||||
|
||||
def setVarFile(self, command, params):
|
||||
var = params[0]
|
||||
val = params[1]
|
||||
default_file = params[2]
|
||||
command.cooker.saveConfigurationVar(var, val, default_file)
|
||||
def setConfFilter(self, command, params):
|
||||
"""
|
||||
Set the configuration file parsing filter
|
||||
"""
|
||||
filterfunc = params[0]
|
||||
bb.parse.parse_py.ConfHandler.confFilters.append(filterfunc)
|
||||
|
||||
class CommandsAsync:
|
||||
"""
|
||||
|
||||
@@ -239,690 +239,3 @@ class OrderedDict(dict):
|
||||
def viewitems(self):
|
||||
"od.viewitems() -> a set-like object providing a view on od's items"
|
||||
return ItemsView(self)
|
||||
|
||||
# Multiprocessing pool code imported from python 2.7.3. Previous versions of
|
||||
# python have issues in this code which hang pool usage
|
||||
|
||||
#
|
||||
# Module providing the `Pool` class for managing a process pool
|
||||
#
|
||||
# multiprocessing/pool.py
|
||||
#
|
||||
# Copyright (c) 2006-2008, R Oudkerk
|
||||
# All rights reserved.
|
||||
#
|
||||
# Redistribution and use in source and binary forms, with or without
|
||||
# modification, are permitted provided that the following conditions
|
||||
# are met:
|
||||
#
|
||||
# 1. Redistributions of source code must retain the above copyright
|
||||
# notice, this list of conditions and the following disclaimer.
|
||||
# 2. Redistributions in binary form must reproduce the above copyright
|
||||
# notice, this list of conditions and the following disclaimer in the
|
||||
# documentation and/or other materials provided with the distribution.
|
||||
# 3. Neither the name of author nor the names of any contributors may be
|
||||
# used to endorse or promote products derived from this software
|
||||
# without specific prior written permission.
|
||||
#
|
||||
# THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS "AS IS" AND
|
||||
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
||||
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
|
||||
# ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
|
||||
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
|
||||
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
|
||||
# OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
|
||||
# HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
|
||||
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
|
||||
# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
|
||||
# SUCH DAMAGE.
|
||||
#
|
||||
import threading
|
||||
import Queue
|
||||
import itertools
|
||||
import collections
|
||||
import time
|
||||
|
||||
import multiprocessing
|
||||
from multiprocessing import Process, cpu_count, TimeoutError, pool
|
||||
from multiprocessing.util import Finalize, debug
|
||||
|
||||
#
|
||||
# Constants representing the state of a pool
|
||||
#
|
||||
|
||||
RUN = 0
|
||||
CLOSE = 1
|
||||
TERMINATE = 2
|
||||
|
||||
#
|
||||
# Miscellaneous
|
||||
#
|
||||
|
||||
def mapstar(args):
|
||||
return map(*args)
|
||||
|
||||
class MaybeEncodingError(Exception):
|
||||
"""Wraps possible unpickleable errors, so they can be
|
||||
safely sent through the socket."""
|
||||
|
||||
def __init__(self, exc, value):
|
||||
self.exc = repr(exc)
|
||||
self.value = repr(value)
|
||||
super(MaybeEncodingError, self).__init__(self.exc, self.value)
|
||||
|
||||
def __str__(self):
|
||||
return "Error sending result: '%s'. Reason: '%s'" % (self.value,
|
||||
self.exc)
|
||||
|
||||
def __repr__(self):
|
||||
return "<MaybeEncodingError: %s>" % str(self)
|
||||
|
||||
def worker(inqueue, outqueue, initializer=None, initargs=(), maxtasks=None):
|
||||
assert maxtasks is None or (type(maxtasks) == int and maxtasks > 0)
|
||||
put = outqueue.put
|
||||
get = inqueue.get
|
||||
if hasattr(inqueue, '_writer'):
|
||||
inqueue._writer.close()
|
||||
outqueue._reader.close()
|
||||
|
||||
if initializer is not None:
|
||||
initializer(*initargs)
|
||||
|
||||
completed = 0
|
||||
while maxtasks is None or (maxtasks and completed < maxtasks):
|
||||
try:
|
||||
task = get()
|
||||
except (EOFError, IOError):
|
||||
debug('worker got EOFError or IOError -- exiting')
|
||||
break
|
||||
|
||||
if task is None:
|
||||
debug('worker got sentinel -- exiting')
|
||||
break
|
||||
|
||||
job, i, func, args, kwds = task
|
||||
try:
|
||||
result = (True, func(*args, **kwds))
|
||||
except Exception, e:
|
||||
result = (False, e)
|
||||
try:
|
||||
put((job, i, result))
|
||||
except Exception as e:
|
||||
wrapped = MaybeEncodingError(e, result[1])
|
||||
debug("Possible encoding error while sending result: %s" % (
|
||||
wrapped))
|
||||
put((job, i, (False, wrapped)))
|
||||
completed += 1
|
||||
debug('worker exiting after %d tasks' % completed)
|
||||
|
||||
|
||||
class Pool(object):
|
||||
'''
|
||||
Class which supports an async version of the `apply()` builtin
|
||||
'''
|
||||
Process = Process
|
||||
|
||||
def __init__(self, processes=None, initializer=None, initargs=(),
|
||||
maxtasksperchild=None):
|
||||
self._setup_queues()
|
||||
self._taskqueue = Queue.Queue()
|
||||
self._cache = {}
|
||||
self._state = RUN
|
||||
self._maxtasksperchild = maxtasksperchild
|
||||
self._initializer = initializer
|
||||
self._initargs = initargs
|
||||
|
||||
if processes is None:
|
||||
try:
|
||||
processes = cpu_count()
|
||||
except NotImplementedError:
|
||||
processes = 1
|
||||
if processes < 1:
|
||||
raise ValueError("Number of processes must be at least 1")
|
||||
|
||||
if initializer is not None and not hasattr(initializer, '__call__'):
|
||||
raise TypeError('initializer must be a callable')
|
||||
|
||||
self._processes = processes
|
||||
self._pool = []
|
||||
self._repopulate_pool()
|
||||
|
||||
self._worker_handler = threading.Thread(
|
||||
target=Pool._handle_workers,
|
||||
args=(self, )
|
||||
)
|
||||
self._worker_handler.daemon = True
|
||||
self._worker_handler._state = RUN
|
||||
self._worker_handler.start()
|
||||
|
||||
|
||||
self._task_handler = threading.Thread(
|
||||
target=Pool._handle_tasks,
|
||||
args=(self._taskqueue, self._quick_put, self._outqueue, self._pool)
|
||||
)
|
||||
self._task_handler.daemon = True
|
||||
self._task_handler._state = RUN
|
||||
self._task_handler.start()
|
||||
|
||||
self._result_handler = threading.Thread(
|
||||
target=Pool._handle_results,
|
||||
args=(self._outqueue, self._quick_get, self._cache)
|
||||
)
|
||||
self._result_handler.daemon = True
|
||||
self._result_handler._state = RUN
|
||||
self._result_handler.start()
|
||||
|
||||
self._terminate = Finalize(
|
||||
self, self._terminate_pool,
|
||||
args=(self._taskqueue, self._inqueue, self._outqueue, self._pool,
|
||||
self._worker_handler, self._task_handler,
|
||||
self._result_handler, self._cache),
|
||||
exitpriority=15
|
||||
)
|
||||
|
||||
def _join_exited_workers(self):
|
||||
"""Cleanup after any worker processes which have exited due to reaching
|
||||
their specified lifetime. Returns True if any workers were cleaned up.
|
||||
"""
|
||||
cleaned = False
|
||||
for i in reversed(range(len(self._pool))):
|
||||
worker = self._pool[i]
|
||||
if worker.exitcode is not None:
|
||||
# worker exited
|
||||
debug('cleaning up worker %d' % i)
|
||||
worker.join()
|
||||
cleaned = True
|
||||
del self._pool[i]
|
||||
return cleaned
|
||||
|
||||
def _repopulate_pool(self):
|
||||
"""Bring the number of pool processes up to the specified number,
|
||||
for use after reaping workers which have exited.
|
||||
"""
|
||||
for i in range(self._processes - len(self._pool)):
|
||||
w = self.Process(target=worker,
|
||||
args=(self._inqueue, self._outqueue,
|
||||
self._initializer,
|
||||
self._initargs, self._maxtasksperchild)
|
||||
)
|
||||
self._pool.append(w)
|
||||
w.name = w.name.replace('Process', 'PoolWorker')
|
||||
w.daemon = True
|
||||
w.start()
|
||||
debug('added worker')
|
||||
|
||||
def _maintain_pool(self):
|
||||
"""Clean up any exited workers and start replacements for them.
|
||||
"""
|
||||
if self._join_exited_workers():
|
||||
self._repopulate_pool()
|
||||
|
||||
def _setup_queues(self):
|
||||
from multiprocessing.queues import SimpleQueue
|
||||
self._inqueue = SimpleQueue()
|
||||
self._outqueue = SimpleQueue()
|
||||
self._quick_put = self._inqueue._writer.send
|
||||
self._quick_get = self._outqueue._reader.recv
|
||||
|
||||
def apply(self, func, args=(), kwds={}):
|
||||
'''
|
||||
Equivalent of `apply()` builtin
|
||||
'''
|
||||
assert self._state == RUN
|
||||
return self.apply_async(func, args, kwds).get()
|
||||
|
||||
def map(self, func, iterable, chunksize=None):
|
||||
'''
|
||||
Equivalent of `map()` builtin
|
||||
'''
|
||||
assert self._state == RUN
|
||||
return self.map_async(func, iterable, chunksize).get()
|
||||
|
||||
def imap(self, func, iterable, chunksize=1):
|
||||
'''
|
||||
Equivalent of `itertools.imap()` -- can be MUCH slower than `Pool.map()`
|
||||
'''
|
||||
assert self._state == RUN
|
||||
if chunksize == 1:
|
||||
result = IMapIterator(self._cache)
|
||||
self._taskqueue.put((((result._job, i, func, (x,), {})
|
||||
for i, x in enumerate(iterable)), result._set_length))
|
||||
return result
|
||||
else:
|
||||
assert chunksize > 1
|
||||
task_batches = Pool._get_tasks(func, iterable, chunksize)
|
||||
result = IMapIterator(self._cache)
|
||||
self._taskqueue.put((((result._job, i, mapstar, (x,), {})
|
||||
for i, x in enumerate(task_batches)), result._set_length))
|
||||
return (item for chunk in result for item in chunk)
|
||||
|
||||
def imap_unordered(self, func, iterable, chunksize=1):
|
||||
'''
|
||||
Like `imap()` method but ordering of results is arbitrary
|
||||
'''
|
||||
assert self._state == RUN
|
||||
if chunksize == 1:
|
||||
result = IMapUnorderedIterator(self._cache)
|
||||
self._taskqueue.put((((result._job, i, func, (x,), {})
|
||||
for i, x in enumerate(iterable)), result._set_length))
|
||||
return result
|
||||
else:
|
||||
assert chunksize > 1
|
||||
task_batches = Pool._get_tasks(func, iterable, chunksize)
|
||||
result = IMapUnorderedIterator(self._cache)
|
||||
self._taskqueue.put((((result._job, i, mapstar, (x,), {})
|
||||
for i, x in enumerate(task_batches)), result._set_length))
|
||||
return (item for chunk in result for item in chunk)
|
||||
|
||||
def apply_async(self, func, args=(), kwds={}, callback=None):
|
||||
'''
|
||||
Asynchronous equivalent of `apply()` builtin
|
||||
'''
|
||||
assert self._state == RUN
|
||||
result = ApplyResult(self._cache, callback)
|
||||
self._taskqueue.put(([(result._job, None, func, args, kwds)], None))
|
||||
return result
|
||||
|
||||
def map_async(self, func, iterable, chunksize=None, callback=None):
|
||||
'''
|
||||
Asynchronous equivalent of `map()` builtin
|
||||
'''
|
||||
assert self._state == RUN
|
||||
if not hasattr(iterable, '__len__'):
|
||||
iterable = list(iterable)
|
||||
|
||||
if chunksize is None:
|
||||
chunksize, extra = divmod(len(iterable), len(self._pool) * 4)
|
||||
if extra:
|
||||
chunksize += 1
|
||||
if len(iterable) == 0:
|
||||
chunksize = 0
|
||||
|
||||
task_batches = Pool._get_tasks(func, iterable, chunksize)
|
||||
result = MapResult(self._cache, chunksize, len(iterable), callback)
|
||||
self._taskqueue.put((((result._job, i, mapstar, (x,), {})
|
||||
for i, x in enumerate(task_batches)), None))
|
||||
return result
|
||||
|
||||
@staticmethod
|
||||
def _handle_workers(pool):
|
||||
thread = threading.current_thread()
|
||||
|
||||
# Keep maintaining workers until the cache gets drained, unless the pool
|
||||
# is terminated.
|
||||
while thread._state == RUN or (pool._cache and thread._state != TERMINATE):
|
||||
pool._maintain_pool()
|
||||
time.sleep(0.1)
|
||||
# send sentinel to stop workers
|
||||
pool._taskqueue.put(None)
|
||||
debug('worker handler exiting')
|
||||
|
||||
@staticmethod
|
||||
def _handle_tasks(taskqueue, put, outqueue, pool):
|
||||
thread = threading.current_thread()
|
||||
|
||||
for taskseq, set_length in iter(taskqueue.get, None):
|
||||
i = -1
|
||||
for i, task in enumerate(taskseq):
|
||||
if thread._state:
|
||||
debug('task handler found thread._state != RUN')
|
||||
break
|
||||
try:
|
||||
put(task)
|
||||
except IOError:
|
||||
debug('could not put task on queue')
|
||||
break
|
||||
else:
|
||||
if set_length:
|
||||
debug('doing set_length()')
|
||||
set_length(i+1)
|
||||
continue
|
||||
break
|
||||
else:
|
||||
debug('task handler got sentinel')
|
||||
|
||||
|
||||
try:
|
||||
# tell result handler to finish when cache is empty
|
||||
debug('task handler sending sentinel to result handler')
|
||||
outqueue.put(None)
|
||||
|
||||
# tell workers there is no more work
|
||||
debug('task handler sending sentinel to workers')
|
||||
for p in pool:
|
||||
put(None)
|
||||
except IOError:
|
||||
debug('task handler got IOError when sending sentinels')
|
||||
|
||||
debug('task handler exiting')
|
||||
|
||||
@staticmethod
|
||||
def _handle_results(outqueue, get, cache):
|
||||
thread = threading.current_thread()
|
||||
|
||||
while 1:
|
||||
try:
|
||||
task = get()
|
||||
except (IOError, EOFError):
|
||||
debug('result handler got EOFError/IOError -- exiting')
|
||||
return
|
||||
|
||||
if thread._state:
|
||||
assert thread._state == TERMINATE
|
||||
debug('result handler found thread._state=TERMINATE')
|
||||
break
|
||||
|
||||
if task is None:
|
||||
debug('result handler got sentinel')
|
||||
break
|
||||
|
||||
job, i, obj = task
|
||||
try:
|
||||
cache[job]._set(i, obj)
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
while cache and thread._state != TERMINATE:
|
||||
try:
|
||||
task = get()
|
||||
except (IOError, EOFError):
|
||||
debug('result handler got EOFError/IOError -- exiting')
|
||||
return
|
||||
|
||||
if task is None:
|
||||
debug('result handler ignoring extra sentinel')
|
||||
continue
|
||||
job, i, obj = task
|
||||
try:
|
||||
cache[job]._set(i, obj)
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if hasattr(outqueue, '_reader'):
|
||||
debug('ensuring that outqueue is not full')
|
||||
# If we don't make room available in outqueue then
|
||||
# attempts to add the sentinel (None) to outqueue may
|
||||
# block. There is guaranteed to be no more than 2 sentinels.
|
||||
try:
|
||||
for i in range(10):
|
||||
if not outqueue._reader.poll():
|
||||
break
|
||||
get()
|
||||
except (IOError, EOFError):
|
||||
pass
|
||||
|
||||
debug('result handler exiting: len(cache)=%s, thread._state=%s',
|
||||
len(cache), thread._state)
|
||||
|
||||
@staticmethod
|
||||
def _get_tasks(func, it, size):
|
||||
it = iter(it)
|
||||
while 1:
|
||||
x = tuple(itertools.islice(it, size))
|
||||
if not x:
|
||||
return
|
||||
yield (func, x)
|
||||
|
||||
def __reduce__(self):
|
||||
raise NotImplementedError(
|
||||
'pool objects cannot be passed between processes or pickled'
|
||||
)
|
||||
|
||||
def close(self):
|
||||
debug('closing pool')
|
||||
if self._state == RUN:
|
||||
self._state = CLOSE
|
||||
self._worker_handler._state = CLOSE
|
||||
|
||||
def terminate(self):
|
||||
debug('terminating pool')
|
||||
self._state = TERMINATE
|
||||
self._worker_handler._state = TERMINATE
|
||||
self._terminate()
|
||||
|
||||
def join(self):
|
||||
debug('joining pool')
|
||||
assert self._state in (CLOSE, TERMINATE)
|
||||
self._worker_handler.join()
|
||||
self._task_handler.join()
|
||||
self._result_handler.join()
|
||||
for p in self._pool:
|
||||
p.join()
|
||||
|
||||
@staticmethod
|
||||
def _help_stuff_finish(inqueue, task_handler, size):
|
||||
# task_handler may be blocked trying to put items on inqueue
|
||||
debug('removing tasks from inqueue until task handler finished')
|
||||
inqueue._rlock.acquire()
|
||||
while task_handler.is_alive() and inqueue._reader.poll():
|
||||
inqueue._reader.recv()
|
||||
time.sleep(0)
|
||||
|
||||
@classmethod
|
||||
def _terminate_pool(cls, taskqueue, inqueue, outqueue, pool,
|
||||
worker_handler, task_handler, result_handler, cache):
|
||||
# this is guaranteed to only be called once
|
||||
debug('finalizing pool')
|
||||
|
||||
worker_handler._state = TERMINATE
|
||||
task_handler._state = TERMINATE
|
||||
|
||||
debug('helping task handler/workers to finish')
|
||||
cls._help_stuff_finish(inqueue, task_handler, len(pool))
|
||||
|
||||
assert result_handler.is_alive() or len(cache) == 0
|
||||
|
||||
result_handler._state = TERMINATE
|
||||
outqueue.put(None) # sentinel
|
||||
|
||||
# We must wait for the worker handler to exit before terminating
|
||||
# workers because we don't want workers to be restarted behind our back.
|
||||
debug('joining worker handler')
|
||||
if threading.current_thread() is not worker_handler:
|
||||
worker_handler.join(1e100)
|
||||
|
||||
# Terminate workers which haven't already finished.
|
||||
if pool and hasattr(pool[0], 'terminate'):
|
||||
debug('terminating workers')
|
||||
for p in pool:
|
||||
if p.exitcode is None:
|
||||
p.terminate()
|
||||
|
||||
debug('joining task handler')
|
||||
if threading.current_thread() is not task_handler:
|
||||
task_handler.join(1e100)
|
||||
|
||||
debug('joining result handler')
|
||||
if threading.current_thread() is not result_handler:
|
||||
result_handler.join(1e100)
|
||||
|
||||
if pool and hasattr(pool[0], 'terminate'):
|
||||
debug('joining pool workers')
|
||||
for p in pool:
|
||||
if p.is_alive():
|
||||
# worker has not yet exited
|
||||
debug('cleaning up worker %d' % p.pid)
|
||||
p.join()
|
||||
|
||||
class ApplyResult(object):
|
||||
|
||||
def __init__(self, cache, callback):
|
||||
self._cond = threading.Condition(threading.Lock())
|
||||
self._job = multiprocessing.pool.job_counter.next()
|
||||
self._cache = cache
|
||||
self._ready = False
|
||||
self._callback = callback
|
||||
cache[self._job] = self
|
||||
|
||||
def ready(self):
|
||||
return self._ready
|
||||
|
||||
def successful(self):
|
||||
assert self._ready
|
||||
return self._success
|
||||
|
||||
def wait(self, timeout=None):
|
||||
self._cond.acquire()
|
||||
try:
|
||||
if not self._ready:
|
||||
self._cond.wait(timeout)
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
def get(self, timeout=None):
|
||||
self.wait(timeout)
|
||||
if not self._ready:
|
||||
raise TimeoutError
|
||||
if self._success:
|
||||
return self._value
|
||||
else:
|
||||
raise self._value
|
||||
|
||||
def _set(self, i, obj):
|
||||
self._success, self._value = obj
|
||||
if self._callback and self._success:
|
||||
self._callback(self._value)
|
||||
self._cond.acquire()
|
||||
try:
|
||||
self._ready = True
|
||||
self._cond.notify()
|
||||
finally:
|
||||
self._cond.release()
|
||||
del self._cache[self._job]
|
||||
|
||||
#
|
||||
# Class whose instances are returned by `Pool.map_async()`
|
||||
#
|
||||
|
||||
class MapResult(ApplyResult):
|
||||
|
||||
def __init__(self, cache, chunksize, length, callback):
|
||||
ApplyResult.__init__(self, cache, callback)
|
||||
self._success = True
|
||||
self._value = [None] * length
|
||||
self._chunksize = chunksize
|
||||
if chunksize <= 0:
|
||||
self._number_left = 0
|
||||
self._ready = True
|
||||
del cache[self._job]
|
||||
else:
|
||||
self._number_left = length//chunksize + bool(length % chunksize)
|
||||
|
||||
def _set(self, i, success_result):
|
||||
success, result = success_result
|
||||
if success:
|
||||
self._value[i*self._chunksize:(i+1)*self._chunksize] = result
|
||||
self._number_left -= 1
|
||||
if self._number_left == 0:
|
||||
if self._callback:
|
||||
self._callback(self._value)
|
||||
del self._cache[self._job]
|
||||
self._cond.acquire()
|
||||
try:
|
||||
self._ready = True
|
||||
self._cond.notify()
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
else:
|
||||
self._success = False
|
||||
self._value = result
|
||||
del self._cache[self._job]
|
||||
self._cond.acquire()
|
||||
try:
|
||||
self._ready = True
|
||||
self._cond.notify()
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
#
|
||||
# Class whose instances are returned by `Pool.imap()`
|
||||
#
|
||||
|
||||
class IMapIterator(object):
|
||||
|
||||
def __init__(self, cache):
|
||||
self._cond = threading.Condition(threading.Lock())
|
||||
self._job = multiprocessing.pool.job_counter.next()
|
||||
self._cache = cache
|
||||
self._items = collections.deque()
|
||||
self._index = 0
|
||||
self._length = None
|
||||
self._unsorted = {}
|
||||
cache[self._job] = self
|
||||
|
||||
def __iter__(self):
|
||||
return self
|
||||
|
||||
def next(self, timeout=None):
|
||||
self._cond.acquire()
|
||||
try:
|
||||
try:
|
||||
item = self._items.popleft()
|
||||
except IndexError:
|
||||
if self._index == self._length:
|
||||
raise StopIteration
|
||||
self._cond.wait(timeout)
|
||||
try:
|
||||
item = self._items.popleft()
|
||||
except IndexError:
|
||||
if self._index == self._length:
|
||||
raise StopIteration
|
||||
raise TimeoutError
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
success, value = item
|
||||
if success:
|
||||
return value
|
||||
raise value
|
||||
|
||||
__next__ = next # XXX
|
||||
|
||||
def _set(self, i, obj):
|
||||
self._cond.acquire()
|
||||
try:
|
||||
if self._index == i:
|
||||
self._items.append(obj)
|
||||
self._index += 1
|
||||
while self._index in self._unsorted:
|
||||
obj = self._unsorted.pop(self._index)
|
||||
self._items.append(obj)
|
||||
self._index += 1
|
||||
self._cond.notify()
|
||||
else:
|
||||
self._unsorted[i] = obj
|
||||
|
||||
if self._index == self._length:
|
||||
del self._cache[self._job]
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
def _set_length(self, length):
|
||||
self._cond.acquire()
|
||||
try:
|
||||
self._length = length
|
||||
if self._index == self._length:
|
||||
self._cond.notify()
|
||||
del self._cache[self._job]
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
#
|
||||
# Class whose instances are returned by `Pool.imap_unordered()`
|
||||
#
|
||||
|
||||
class IMapUnorderedIterator(IMapIterator):
|
||||
|
||||
def _set(self, i, obj):
|
||||
self._cond.acquire()
|
||||
try:
|
||||
self._items.append(obj)
|
||||
self._index += 1
|
||||
self._cond.notify()
|
||||
if self._index == self._length:
|
||||
del self._cache[self._job]
|
||||
finally:
|
||||
self._cond.release()
|
||||
|
||||
|
||||
|
||||
@@ -177,24 +177,21 @@ class BBCooker:

    def initConfigurationData(self):
        self.configuration.data = bb.data.init()
        if self.configuration.show_environment:
            self.configuration.data.enableTracking()

        if not self.server_registration_cb:
            self.configuration.data.setVar("BB_WORKERCONTEXT", "1")

        filtered_keys = bb.utils.approved_variables()
        bb.data.inheritFromOS(self.configuration.data, self.savedenv, filtered_keys)
        self.configuration.data.setVar("BB_ORIGENV", self.savedenv)

    def enableDataTracking(self):
        self.configuration.data.enableTracking()

    def disableDataTracking(self):
        self.configuration.data.disableTracking()

    def loadConfigurationData(self):
        self.initConfigurationData()
        self.configuration.data = bb.data.init()

        if not self.server_registration_cb:
            self.configuration.data.setVar("BB_WORKERCONTEXT", "1")

        filtered_keys = bb.utils.approved_variables()
        bb.data.inheritFromOS(self.configuration.data, self.savedenv, filtered_keys)

        try:
            self.parseConfigurationFiles(self.configuration.prefile,
@@ -208,89 +205,6 @@ class BBCooker:
        if not self.configuration.cmd:
            self.configuration.cmd = self.configuration.data.getVar("BB_DEFAULT_TASK", True) or "build"

    def saveConfigurationVar(self, var, val, default_file):

        replaced = False
        #do not save if nothing changed
        if str(val) == self.configuration.data.getVar(var):
            return

        conf_files = self.configuration.data.varhistory.get_variable_files(var)

        #format the value when it is a list
        if isinstance(val, list):
            listval = ""
            for value in val:
                listval += "%s " % value
            val = listval

        topdir = self.configuration.data.getVar("TOPDIR")

        #comment or replace operations made on var
        for conf_file in conf_files:
            if topdir in conf_file:
                with open(conf_file, 'r') as f:
                    contents = f.readlines()
                    f.close()

                lines = self.configuration.data.varhistory.get_variable_lines(var, conf_file)
                for line in lines:
                    total = ""
                    i = 0
                    for c in contents:
                        total += c
                        i = i + 1
                        if i==int(line):
                            end_index = len(total)
                    index = total.rfind(var, 0, end_index)

                    begin_line = total.count("\n",0,index)
                    end_line = int(line)

                    #check if the variable was saved before in the same way
                    #if true it replace the place where the variable was declared
                    #else it comments it
                    if contents[begin_line-1]== "#added by bitbake\n":
                        contents[begin_line] = "%s = \"%s\"\n" % (var, val)
                        replaced = True
                    else:
                        for ii in range(begin_line, end_line):
                            contents[ii] = "#" + contents[ii]

                total = ""
                for c in contents:
                    total += c
                with open(conf_file, 'w') as f:
                    f.write(total)
                    f.close()

        if replaced == False:
            #remove var from history
            self.configuration.data.varhistory.del_var_history(var)

            #add var to the end of default_file
            default_file = self._findConfigFile(default_file)

            with open(default_file, 'r') as f:
                contents = f.readlines()
                f.close()

            total = ""
            for c in contents:
                total += c

            #add the variable on a single line, to be easy to replace the second time
            total += "#added by bitbake"
            total += "\n%s = \"%s\"\n" % (var, val)

            with open(default_file, 'w') as f:
                f.write(total)
                f.close()

            #add to history
            loginfo = {"op":set, "file":default_file, "line":total.count("\n")}
            self.configuration.data.setVar(var, val, **loginfo)

    def parseConfiguration(self):

        # Set log file verbosity
@@ -325,10 +239,8 @@ class BBCooker:
|
||||
self.commandlineAction['msg'] = "No target should be used with the --environment and --buildfile options."
|
||||
elif len(self.configuration.pkgs_to_build) > 0:
|
||||
self.commandlineAction['action'] = ["showEnvironmentTarget", self.configuration.pkgs_to_build]
|
||||
self.configuration.data.setVar("BB_CONSOLELOG", None)
|
||||
else:
|
||||
self.commandlineAction['action'] = ["showEnvironment", self.configuration.buildfile]
|
||||
self.configuration.data.setVar("BB_CONSOLELOG", None)
|
||||
elif self.configuration.buildfile is not None:
|
||||
self.commandlineAction['action'] = ["buildFile", self.configuration.buildfile, self.configuration.cmd]
|
||||
elif self.configuration.revisions_changed:
|
||||
@@ -399,10 +311,6 @@ class BBCooker:
|
||||
elif len(pkgs_to_build) == 1:
|
||||
self.updateCache()
|
||||
|
||||
ignore = self.configuration.data.getVar("ASSUME_PROVIDED", True) or ""
|
||||
if pkgs_to_build[0] in set(ignore.split()):
|
||||
bb.fatal("%s is in ASSUME_PROVIDED" % pkgs_to_build[0])
|
||||
|
||||
localdata = data.createCopy(self.configuration.data)
|
||||
bb.data.update_data(localdata)
|
||||
bb.data.expandKeys(localdata)
|
||||
@@ -424,11 +332,6 @@ class BBCooker:
|
||||
parselog.exception("Unable to read %s", fn)
|
||||
raise
|
||||
|
||||
# Display history
|
||||
with closing(StringIO()) as env:
|
||||
self.configuration.data.inchistory.emit(env)
|
||||
logger.plain(env.getvalue())
|
||||
|
||||
# emit variables and shell functions
|
||||
data.update_data(envdata)
|
||||
with closing(StringIO()) as env:
|
||||
@@ -471,8 +374,6 @@ class BBCooker:
|
||||
taskdata.add_unresolved(localdata, self.status)
|
||||
bb.event.fire(bb.event.TreeDataPreparationCompleted(len(pkgs_to_build)), self.configuration.data)
|
||||
return runlist, taskdata
|
||||
|
||||
######## WARNING : this function requires cache_extra to be enabled ########
|
||||
|
||||
def generateTaskDepTreeData(self, pkgs_to_build, task):
|
||||
"""
|
||||
@@ -546,7 +447,6 @@ class BBCooker:
|
||||
|
||||
return depend_tree
|
||||
|
||||
######## WARNING : this function requires cache_extra to be enabled ########
|
||||
def generatePkgDepTreeData(self, pkgs_to_build, task):
|
||||
"""
|
||||
Create a dependency tree of pkgs_to_build, returning the data.
|
||||
@@ -574,12 +474,8 @@ class BBCooker:
|
||||
lic = self.status.license[fn]
|
||||
section = self.status.section[fn]
|
||||
description = self.status.description[fn]
|
||||
homepage = self.status.homepage[fn]
|
||||
bugtracker = self.status.bugtracker[fn]
|
||||
files_info = self.status.files_info[fn]
|
||||
rdepends = self.status.rundeps[fn]
|
||||
rrecs = self.status.runrecs[fn]
|
||||
prevision = self.status.prevision[fn]
|
||||
inherits = self.status.inherits.get(fn, None)
|
||||
if pn not in depend_tree["pn"]:
|
||||
depend_tree["pn"][pn] = {}
|
||||
@@ -590,10 +486,6 @@ class BBCooker:
|
||||
depend_tree["pn"][pn]["section"] = section
|
||||
depend_tree["pn"][pn]["description"] = description
|
||||
depend_tree["pn"][pn]["inherits"] = inherits
|
||||
depend_tree["pn"][pn]["homepage"] = homepage
|
||||
depend_tree["pn"][pn]["bugtracker"] = bugtracker
|
||||
depend_tree["pn"][pn]["files_info"] = files_info
|
||||
depend_tree["pn"][pn]["revision"] = prevision
|
||||
|
||||
if fnid not in seen_fnids:
|
||||
seen_fnids.append(fnid)
|
||||
@@ -798,8 +690,8 @@ class BBCooker:
|
||||
# Generate a list of parsed configuration files by searching the files
|
||||
# listed in the __depends and __base_depends variables with a .conf suffix.
|
||||
conffiles = []
|
||||
dep_files = self.configuration.data.getVar('__base_depends') or []
|
||||
dep_files = dep_files + (self.configuration.data.getVar('__depends') or [])
|
||||
dep_files = self.configuration.data.getVar('__depends') or set()
|
||||
dep_files.union(self.configuration.data.getVar('__base_depends') or set())
|
||||
|
||||
for f in dep_files:
|
||||
if f[0].endswith(".conf"):
|
||||
@@ -988,16 +880,10 @@ class BBCooker:
|
||||
bb.fetch.fetcher_init(data)
|
||||
bb.codeparser.parser_cache_init(data)
|
||||
bb.event.fire(bb.event.ConfigParsed(), data)
|
||||
|
||||
if data.getVar("BB_INVALIDCONF") is True:
|
||||
data.setVar("BB_INVALIDCONF", False)
|
||||
self.parseConfigurationFiles(self.configuration.prefile,
|
||||
self.configuration.postfile)
|
||||
else:
|
||||
bb.parse.init_parser(data)
|
||||
data.setVar('BBINCLUDED',bb.parse.get_file_depends(data))
|
||||
self.configuration.data = data
|
||||
self.configuration.data_hash = data.get_hash()
|
||||
bb.parse.init_parser(data)
|
||||
data.setVar('BBINCLUDED',bb.parse.get_file_depends(data))
|
||||
self.configuration.data = data
|
||||
self.configuration.data_hash = data.get_hash()
|
||||
|
||||
def handleCollections( self, collections ):
|
||||
"""Handle collections"""
|
||||
@@ -1296,25 +1182,6 @@ class BBCooker:
|
||||
|
||||
self.server_registration_cb(buildTargetsIdle, rq)
|
||||
|
||||
def generateNewImage(self, image, base_image, package_queue):
|
||||
'''
|
||||
Create a new image with a "require" base_image statement
|
||||
'''
|
||||
image_name = os.path.splitext(image)[0]
|
||||
timestr = time.strftime("-%Y%m%d-%H%M%S")
|
||||
dest = image_name + str(timestr) + ".bb"
|
||||
|
||||
with open(dest, "w") as imagefile:
|
||||
imagefile.write("require " + base_image + "\n")
|
||||
package_install = "PACKAGE_INSTALL_forcevariable = \""
|
||||
for package in package_queue:
|
||||
package_install += str(package) + " "
|
||||
package_install += "\"\n"
|
||||
imagefile.write(package_install)
|
||||
|
||||
self.state = state.initial
|
||||
return timestr
|
||||
|
||||
def updateCache(self):
|
||||
if self.state == state.running:
|
||||
return
|
||||
@@ -1486,10 +1353,7 @@ class BBCooker:
|
||||
# Empty the environment. The environment will be populated as
|
||||
# necessary from the data store.
|
||||
#bb.utils.empty_environment()
|
||||
try:
|
||||
prserv.serv.auto_start(self.configuration.data)
|
||||
except prserv.serv.PRServiceConfigError:
|
||||
bb.event.fire(CookerExit(), self.configuration.event_data)
|
||||
prserv.serv.auto_start(self.configuration.data)
|
||||
return
|
||||
|
||||
def post_serve(self):
|
||||
@@ -1526,7 +1390,25 @@ def server_main(cooker, func, *args):
|
||||
ret = profile.Profile.runcall(prof, func, *args)
|
||||
|
||||
prof.dump_stats("profile.log")
|
||||
bb.utils.process_profilelog("profile.log")
|
||||
|
||||
# Redirect stdout to capture profile information
|
||||
pout = open('profile.log.processed', 'w')
|
||||
so = sys.stdout.fileno()
|
||||
orig_so = os.dup(sys.stdout.fileno())
|
||||
os.dup2(pout.fileno(), so)
|
||||
|
||||
import pstats
|
||||
p = pstats.Stats('profile.log')
|
||||
p.sort_stats('time')
|
||||
p.print_stats()
|
||||
p.print_callers()
|
||||
p.sort_stats('cumulative')
|
||||
p.print_stats()
|
||||
|
||||
os.dup2(orig_so, so)
|
||||
pout.flush()
|
||||
pout.close()
|
||||
|
||||
print("Raw profiling information saved to profile.log and processed statistics to profile.log.processed")
|
||||
|
||||
else:
|
||||
@@ -1606,7 +1488,6 @@ class Parser(multiprocessing.Process):
|
||||
self.quit = quit
|
||||
self.init = init
|
||||
multiprocessing.Process.__init__(self)
|
||||
self.context = bb.utils._context.copy()
|
||||
|
||||
def run(self):
|
||||
if self.init:
|
||||
@@ -1641,7 +1522,6 @@ class Parser(multiprocessing.Process):
|
||||
|
||||
def parse(self, filename, appends, caches_array):
|
||||
try:
|
||||
bb.utils._context = self.context.copy()
|
||||
return True, bb.cache.Cache.parse(filename, appends, self.cfg, caches_array)
|
||||
except Exception as exc:
|
||||
tb = sys.exc_info()[2]
|
||||
|
||||
@@ -158,12 +158,7 @@ def expandKeys(alterdata, readdata = None):
|
||||
|
||||
for key in todolist:
|
||||
ekey = todolist[key]
|
||||
newval = alterdata.getVar(ekey, 0)
|
||||
if newval:
|
||||
val = alterdata.getVar(key, 0)
|
||||
if val is not None and newval is not None:
|
||||
bb.warn("Variable key %s (%s) replaces original key %s (%s)." % (key, val, ekey, newval))
|
||||
alterdata.renameVar(key, ekey)
|
||||
renameVar(key, ekey, alterdata)
|
||||
|
||||
def inheritFromOS(d, savedenv, permitted):
|
||||
"""Inherit variables from the initial environment."""
|
||||
@@ -171,9 +166,9 @@ def inheritFromOS(d, savedenv, permitted):
|
||||
for s in savedenv.keys():
|
||||
if s in permitted:
|
||||
try:
|
||||
d.setVar(s, getVar(s, savedenv, True), op = 'from env')
|
||||
setVar(s, getVar(s, savedenv, True), d)
|
||||
if s in exportlist:
|
||||
d.setVarFlag(s, "export", True, op = 'auto env export')
|
||||
setVarFlag(s, "export", True, d)
|
||||
except TypeError:
|
||||
pass
|
||||
|
||||
@@ -199,7 +194,8 @@ def emit_var(var, o=sys.__stdout__, d = init(), all=False):
|
||||
return 0
|
||||
|
||||
if all:
|
||||
d.varhistory.emit(var, oval, val, o)
|
||||
commentVal = re.sub('\n', '\n#', str(oval))
|
||||
o.write('# %s=%s\n' % (var, commentVal))
|
||||
|
||||
if (var.find("-") != -1 or var.find(".") != -1 or var.find('{') != -1 or var.find('}') != -1 or var.find('+') != -1) and not all:
|
||||
return 0
|
||||
@@ -264,7 +260,6 @@ def emit_func(func, o=sys.__stdout__, d = init()):
|
||||
|
||||
emit_var(func, o, d, False) and o.write('\n')
|
||||
newdeps = bb.codeparser.ShellParser(func, logger).parse_shell(d.getVar(func, True))
|
||||
newdeps |= set((d.getVarFlag(func, "vardeps", True) or "").split())
|
||||
seen = set()
|
||||
while newdeps:
|
||||
deps = newdeps
|
||||
@@ -278,7 +273,7 @@ def emit_func(func, o=sys.__stdout__, d = init()):
|
||||
|
||||
def update_data(d):
|
||||
"""Performs final steps upon the datastore, including application of overrides"""
|
||||
d.finalize(parent = True)
|
||||
d.finalize()
|
||||
|
||||
def build_dependencies(key, keys, shelldeps, vardepvals, d):
|
||||
deps = set()
|
||||
@@ -362,8 +357,6 @@ def generate_dependencies(d):
|
||||
|
||||
def inherits_class(klass, d):
|
||||
val = getVar('__inherit_cache', d) or []
|
||||
needle = os.path.join('classes', '%s.bbclass' % klass)
|
||||
for v in val:
|
||||
if v.endswith(needle):
|
||||
return True
|
||||
if os.path.join('classes', '%s.bbclass' % klass) in val:
|
||||
return True
|
||||
return False
|
||||
|
||||
@@ -28,7 +28,7 @@ BitBake build tools.
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
# Based on functions from the base bb module, Copyright 2003 Holger Schurig
|
||||
|
||||
import copy, re, sys, traceback
|
||||
import copy, re
|
||||
from collections import MutableMapping
|
||||
import logging
|
||||
import hashlib
|
||||
@@ -43,42 +43,6 @@ __setvar_regexp__ = re.compile('(?P<base>.*?)(?P<keyword>_append|_prepend)(_(?P<
|
||||
__expand_var_regexp__ = re.compile(r"\${[^{}]+}")
|
||||
__expand_python_regexp__ = re.compile(r"\${@.+?}")
|
||||
|
||||
def infer_caller_details(loginfo, parent = False, varval = True):
|
||||
"""Save the caller the trouble of specifying everything."""
|
||||
# Save effort.
|
||||
if 'ignore' in loginfo and loginfo['ignore']:
|
||||
return
|
||||
# If nothing was provided, mark this as possibly unneeded.
|
||||
if not loginfo:
|
||||
loginfo['ignore'] = True
|
||||
return
|
||||
# Infer caller's likely values for variable (var) and value (value),
|
||||
# to reduce clutter in the rest of the code.
|
||||
if varval and ('variable' not in loginfo or 'detail' not in loginfo):
|
||||
try:
|
||||
raise Exception
|
||||
except Exception:
|
||||
tb = sys.exc_info()[2]
|
||||
if parent:
|
||||
above = tb.tb_frame.f_back.f_back
|
||||
else:
|
||||
above = tb.tb_frame.f_back
|
||||
lcls = above.f_locals.items()
|
||||
for k, v in lcls:
|
||||
if k == 'value' and 'detail' not in loginfo:
|
||||
loginfo['detail'] = v
|
||||
if k == 'var' and 'variable' not in loginfo:
|
||||
loginfo['variable'] = v
|
||||
# Infer file/line/function from traceback
|
||||
if 'file' not in loginfo:
|
||||
depth = 3
|
||||
if parent:
|
||||
depth = 4
|
||||
file, line, func, text = traceback.extract_stack(limit = depth)[0]
|
||||
loginfo['file'] = file
|
||||
loginfo['line'] = line
|
||||
if func not in loginfo:
|
||||
loginfo['func'] = func
|
||||
|
||||
class VariableParse:
|
||||
def __init__(self, varname, d, val = None):
|
||||
@@ -150,157 +114,16 @@ class ExpansionError(Exception):
|
||||
def __str__(self):
|
||||
return self.msg
|
||||
|
||||
class IncludeHistory(object):
|
||||
def __init__(self, parent = None, filename = '[TOP LEVEL]'):
|
||||
self.parent = parent
|
||||
self.filename = filename
|
||||
self.children = []
|
||||
self.current = self
|
||||
|
||||
def copy(self):
|
||||
new = IncludeHistory(self.parent, self.filename)
|
||||
for c in self.children:
|
||||
new.children.append(c)
|
||||
return new
|
||||
|
||||
def include(self, filename):
|
||||
newfile = IncludeHistory(self.current, filename)
|
||||
self.current.children.append(newfile)
|
||||
self.current = newfile
|
||||
return self
|
||||
|
||||
def __enter__(self):
|
||||
pass
|
||||
|
||||
def __exit__(self, a, b, c):
|
||||
if self.current.parent:
|
||||
self.current = self.current.parent
|
||||
else:
|
||||
bb.warn("Include log: Tried to finish '%s' at top level." % filename)
|
||||
return False
|
||||
|
||||
def emit(self, o, level = 0):
|
||||
"""Emit an include history file, and its children."""
|
||||
if level:
|
||||
spaces = " " * (level - 1)
|
||||
o.write("# %s%s" % (spaces, self.filename))
|
||||
if len(self.children) > 0:
|
||||
o.write(" includes:")
|
||||
else:
|
||||
o.write("#\n# INCLUDE HISTORY:\n#")
|
||||
level = level + 1
|
||||
for child in self.children:
|
||||
o.write("\n")
|
||||
child.emit(o, level)
|
||||
|
||||
class VariableHistory(object):
|
||||
def __init__(self, dataroot):
|
||||
self.dataroot = dataroot
|
||||
self.variables = COWDictBase.copy()
|
||||
|
||||
def copy(self):
|
||||
new = VariableHistory(self.dataroot)
|
||||
new.variables = self.variables.copy()
|
||||
return new
|
||||
|
||||
def record(self, *kwonly, **loginfo):
|
||||
if not self.dataroot._tracking:
|
||||
return
|
||||
if len(kwonly) > 0:
|
||||
raise TypeError
|
||||
infer_caller_details(loginfo, parent = True)
|
||||
if 'ignore' in loginfo and loginfo['ignore']:
|
||||
return
|
||||
if 'op' not in loginfo or not loginfo['op']:
|
||||
loginfo['op'] = 'set'
|
||||
if 'detail' in loginfo:
|
||||
loginfo['detail'] = str(loginfo['detail'])
|
||||
if 'variable' not in loginfo or 'file' not in loginfo:
|
||||
raise ValueError("record() missing variable or file.")
|
||||
var = loginfo['variable']
|
||||
|
||||
if var not in self.variables:
|
||||
self.variables[var] = []
|
||||
self.variables[var].append(loginfo.copy())
|
||||
|
||||
def variable(self, var):
|
||||
if var in self.variables:
|
||||
return self.variables[var]
|
||||
else:
|
||||
return []
|
||||
|
||||
def emit(self, var, oval, val, o):
|
||||
history = self.variable(var)
|
||||
commentVal = re.sub('\n', '\n#', str(oval))
|
||||
if history:
|
||||
if len(history) == 1:
|
||||
o.write("#\n# $%s\n" % var)
|
||||
else:
|
||||
o.write("#\n# $%s [%d operations]\n" % (var, len(history)))
|
||||
for event in history:
|
||||
# o.write("# %s\n" % str(event))
|
||||
if 'func' in event:
|
||||
# If we have a function listed, this is internal
|
||||
# code, not an operation in a config file, and the
|
||||
# full path is distracting.
|
||||
event['file'] = re.sub('.*/', '', event['file'])
|
||||
display_func = ' [%s]' % event['func']
|
||||
else:
|
||||
display_func = ''
|
||||
if 'flag' in event:
|
||||
flag = '[%s] ' % (event['flag'])
|
||||
else:
|
||||
flag = ''
|
||||
o.write("# %s %s:%s%s\n# %s\"%s\"\n" % (event['op'], event['file'], event['line'], display_func, flag, re.sub('\n', '\n# ', event['detail'])))
|
||||
if len(history) > 1:
|
||||
o.write("# computed:\n")
|
||||
o.write('# "%s"\n' % (commentVal))
|
||||
else:
|
||||
o.write("#\n# $%s\n# [no history recorded]\n#\n" % var)
|
||||
o.write('# "%s"\n' % (commentVal))
|
||||
|
||||
def get_variable_files(self, var):
|
||||
"""Get the files where operations are made on a variable"""
|
||||
var_history = self.variable(var)
|
||||
files = []
|
||||
for event in var_history:
|
||||
files.append(event['file'])
|
||||
return files
|
||||
|
||||
def get_variable_lines(self, var, f):
|
||||
"""Get the line where a operation is made on a variable in file f"""
|
||||
var_history = self.variable(var)
|
||||
lines = []
|
||||
for event in var_history:
|
||||
if f== event['file']:
|
||||
line = event['line']
|
||||
lines.append(line)
|
||||
return lines
|
||||
|
||||
def del_var_history(self, var):
|
||||
if var in self.variables:
|
||||
self.variables[var] = []
|
||||
|
||||
class DataSmart(MutableMapping):
|
||||
def __init__(self, special = COWDictBase.copy(), seen = COWDictBase.copy() ):
|
||||
self.dict = {}
|
||||
|
||||
self.inchistory = IncludeHistory()
|
||||
self.varhistory = VariableHistory(self)
|
||||
self._tracking = False
|
||||
|
||||
# cookie monster tribute
|
||||
self._special_values = special
|
||||
self._seen_overrides = seen
|
||||
|
||||
self.expand_cache = {}
|
||||
|
||||
def enableTracking(self):
|
||||
self._tracking = True
|
||||
|
||||
def disableTracking(self):
|
||||
self._tracking = False
|
||||
|
||||
def expandWithRefs(self, s, varname):
|
||||
|
||||
if not isinstance(s, basestring): # sanity check
|
||||
@@ -320,8 +143,6 @@ class DataSmart(MutableMapping):
|
||||
break
|
||||
except ExpansionError:
|
||||
raise
|
||||
except bb.parse.SkipPackage:
|
||||
raise
|
||||
except Exception as exc:
|
||||
raise ExpansionError(varname, s, exc)
|
||||
|
||||
@@ -336,14 +157,10 @@ class DataSmart(MutableMapping):
|
||||
return self.expandWithRefs(s, varname).value
|
||||
|
||||
|
||||
def finalize(self, parent = False):
|
||||
def finalize(self):
|
||||
"""Performs final steps upon the datastore, including application of overrides"""
|
||||
|
||||
overrides = (self.getVar("OVERRIDES", True) or "").split(":") or []
|
||||
finalize_caller = {
|
||||
'op': 'finalize',
|
||||
}
|
||||
infer_caller_details(finalize_caller, parent = parent, varval = False)
|
||||
|
||||
#
|
||||
# Well let us see what breaks here. We used to iterate
|
||||
@@ -360,9 +177,6 @@ class DataSmart(MutableMapping):
|
||||
# Then we will handle _append and _prepend
|
||||
#
|
||||
|
||||
# We only want to report finalization once per variable overridden.
|
||||
finalizes_reported = {}
|
||||
|
||||
for o in overrides:
|
||||
# calculate '_'+override
|
||||
l = len(o) + 1
|
||||
@@ -375,19 +189,7 @@ class DataSmart(MutableMapping):
|
||||
for var in vars:
|
||||
name = var[:-l]
|
||||
try:
|
||||
# Report only once, even if multiple changes.
|
||||
if name not in finalizes_reported:
|
||||
finalizes_reported[name] = True
|
||||
finalize_caller['variable'] = name
|
||||
finalize_caller['detail'] = 'was: ' + str(self.getVar(name, False))
|
||||
self.varhistory.record(**finalize_caller)
|
||||
# Copy history of the override over.
|
||||
for event in self.varhistory.variable(var):
|
||||
loginfo = event.copy()
|
||||
loginfo['variable'] = name
|
||||
loginfo['op'] = 'override[%s]:%s' % (o, loginfo['op'])
|
||||
self.varhistory.record(**loginfo)
|
||||
self.setVar(name, self.getVar(var, False), op = 'finalize', file = 'override[%s]' % o, line = '')
|
||||
self.setVar(name, self.getVar(var, False))
|
||||
self.delVar(var)
|
||||
except Exception:
|
||||
logger.info("Untracked delVar")
|
||||
@@ -418,9 +220,9 @@ class DataSmart(MutableMapping):
|
||||
|
||||
# We save overrides that may be applied at some later stage
|
||||
if keep:
|
||||
self.setVarFlag(append, op, keep, ignore=True)
|
||||
self.setVarFlag(append, op, keep)
|
||||
else:
|
||||
self.delVarFlag(append, op, ignore=True)
|
||||
self.delVarFlag(append, op)
|
||||
|
||||
def initVar(self, var):
|
||||
self.expand_cache = {}
|
||||
@@ -448,11 +250,7 @@ class DataSmart(MutableMapping):
|
||||
else:
|
||||
self.initVar(var)
|
||||
|
||||
|
||||
def setVar(self, var, value, **loginfo):
|
||||
#print("var=" + str(var) + " val=" + str(value))
|
||||
if 'op' not in loginfo:
|
||||
loginfo['op'] = "set"
|
||||
def setVar(self, var, value):
|
||||
self.expand_cache = {}
|
||||
match = __setvar_regexp__.match(var)
|
||||
if match and match.group("keyword") in __setvar_keyword__:
|
||||
@@ -461,22 +259,15 @@ class DataSmart(MutableMapping):
|
||||
override = match.group('add')
|
||||
l = self.getVarFlag(base, keyword) or []
|
||||
l.append([value, override])
|
||||
self.setVarFlag(base, keyword, l, ignore=True)
|
||||
# And cause that to be recorded:
|
||||
loginfo['detail'] = value
|
||||
loginfo['variable'] = base
|
||||
if override:
|
||||
loginfo['op'] = '%s[%s]' % (keyword, override)
|
||||
else:
|
||||
loginfo['op'] = keyword
|
||||
self.varhistory.record(**loginfo)
|
||||
self.setVarFlag(base, keyword, l)
|
||||
|
||||
# todo make sure keyword is not __doc__ or __module__
|
||||
# pay the cookie monster
|
||||
try:
|
||||
self._special_values[keyword].add(base)
|
||||
self._special_values[keyword].add( base )
|
||||
except KeyError:
|
||||
self._special_values[keyword] = set()
|
||||
self._special_values[keyword].add(base)
|
||||
self._special_values[keyword].add( base )
|
||||
|
||||
return
|
||||
|
||||
@@ -493,7 +284,6 @@ class DataSmart(MutableMapping):
|
||||
|
||||
# setting var
|
||||
self.dict[var]["_content"] = value
|
||||
self.varhistory.record(**loginfo)
|
||||
|
||||
def getVar(self, var, expand=False, noweakdefault=False):
|
||||
value = self.getVarFlag(var, "_content", False, noweakdefault)
|
||||
@@ -503,17 +293,13 @@ class DataSmart(MutableMapping):
|
||||
return self.expand(value, var)
|
||||
return value
|
||||
|
||||
def renameVar(self, key, newkey, **loginfo):
|
||||
def renameVar(self, key, newkey):
|
||||
"""
|
||||
Rename the variable key to newkey
|
||||
"""
|
||||
val = self.getVar(key, 0)
|
||||
if val is not None:
|
||||
loginfo['variable'] = newkey
|
||||
loginfo['op'] = 'rename from %s' % key
|
||||
loginfo['detail'] = val
|
||||
self.varhistory.record(**loginfo)
|
||||
self.setVar(newkey, val, ignore=True)
|
||||
self.setVar(newkey, val)
|
||||
|
||||
for i in ('_append', '_prepend'):
|
||||
src = self.getVarFlag(key, i)
|
||||
@@ -522,34 +308,23 @@ class DataSmart(MutableMapping):
|
||||
|
||||
dest = self.getVarFlag(newkey, i) or []
|
||||
dest.extend(src)
|
||||
self.setVarFlag(newkey, i, dest, ignore=True)
|
||||
self.setVarFlag(newkey, i, dest)
|
||||
|
||||
if i in self._special_values and key in self._special_values[i]:
|
||||
self._special_values[i].remove(key)
|
||||
self._special_values[i].add(newkey)
|
||||
|
||||
loginfo['variable'] = key
|
||||
loginfo['op'] = 'rename (to)'
|
||||
loginfo['detail'] = newkey
|
||||
self.varhistory.record(**loginfo)
|
||||
self.delVar(key, ignore=True)
|
||||
self.delVar(key)
|
||||
|
||||
def appendVar(self, var, value, **loginfo):
|
||||
loginfo['op'] = 'append'
|
||||
self.varhistory.record(**loginfo)
|
||||
newvalue = (self.getVar(var, False) or "") + value
|
||||
self.setVar(var, newvalue, ignore=True)
|
||||
def appendVar(self, key, value):
|
||||
value = (self.getVar(key, False) or "") + value
|
||||
self.setVar(key, value)
|
||||
|
||||
def prependVar(self, var, value, **loginfo):
|
||||
loginfo['op'] = 'prepend'
|
||||
self.varhistory.record(**loginfo)
|
||||
newvalue = value + (self.getVar(var, False) or "")
|
||||
self.setVar(var, newvalue, ignore=True)
|
||||
def prependVar(self, key, value):
|
||||
value = value + (self.getVar(key, False) or "")
|
||||
self.setVar(key, value)
|
||||
|
||||
def delVar(self, var, **loginfo):
|
||||
loginfo['detail'] = ""
|
||||
loginfo['op'] = 'del'
|
||||
self.varhistory.record(**loginfo)
|
||||
def delVar(self, var):
|
||||
self.expand_cache = {}
|
||||
self.dict[var] = {}
|
||||
if '_' in var:
|
||||
@@ -557,14 +332,10 @@ class DataSmart(MutableMapping):
|
||||
if override and override in self._seen_overrides and var in self._seen_overrides[override]:
|
||||
self._seen_overrides[override].remove(var)
|
||||
|
||||
def setVarFlag(self, var, flag, value, **loginfo):
|
||||
if 'op' not in loginfo:
|
||||
loginfo['op'] = "set"
|
||||
loginfo['flag'] = flag
|
||||
self.varhistory.record(**loginfo)
|
||||
def setVarFlag(self, var, flag, flagvalue):
|
||||
if not var in self.dict:
|
||||
self._makeShadowCopy(var)
|
||||
self.dict[var][flag] = value
|
||||
self.dict[var][flag] = flagvalue
|
||||
|
||||
def getVarFlag(self, var, flag, expand=False, noweakdefault=False):
|
||||
local_var = self._findVar(var)
|
||||
@@ -578,7 +349,7 @@ class DataSmart(MutableMapping):
|
||||
value = self.expand(value, None)
|
||||
return value
|
||||
|
||||
def delVarFlag(self, var, flag, **loginfo):
|
||||
def delVarFlag(self, var, flag):
|
||||
local_var = self._findVar(var)
|
||||
if not local_var:
|
||||
return
|
||||
@@ -586,38 +357,23 @@ class DataSmart(MutableMapping):
|
||||
self._makeShadowCopy(var)
|
||||
|
||||
if var in self.dict and flag in self.dict[var]:
|
||||
loginfo['detail'] = ""
|
||||
loginfo['op'] = 'delFlag'
|
||||
loginfo['flag'] = flag
|
||||
self.varhistory.record(**loginfo)
|
||||
|
||||
del self.dict[var][flag]
|
||||
|
||||
def appendVarFlag(self, var, flag, value, **loginfo):
|
||||
loginfo['op'] = 'append'
|
||||
loginfo['flag'] = flag
|
||||
self.varhistory.record(**loginfo)
|
||||
newvalue = (self.getVarFlag(var, flag, False) or "") + value
|
||||
self.setVarFlag(var, flag, newvalue, ignore=True)
|
||||
def appendVarFlag(self, key, flag, value):
|
||||
value = (self.getVarFlag(key, flag, False) or "") + value
|
||||
self.setVarFlag(key, flag, value)
|
||||
|
||||
def prependVarFlag(self, var, flag, value, **loginfo):
|
||||
loginfo['op'] = 'prepend'
|
||||
loginfo['flag'] = flag
|
||||
self.varhistory.record(**loginfo)
|
||||
newvalue = value + (self.getVarFlag(var, flag, False) or "")
|
||||
self.setVarFlag(var, flag, newvalue, ignore=True)
|
||||
def prependVarFlag(self, key, flag, value):
|
||||
value = value + (self.getVarFlag(key, flag, False) or "")
|
||||
self.setVarFlag(key, flag, value)
|
||||
|
||||
def setVarFlags(self, var, flags, **loginfo):
|
||||
infer_caller_details(loginfo)
|
||||
def setVarFlags(self, var, flags):
|
||||
if not var in self.dict:
|
||||
self._makeShadowCopy(var)
|
||||
|
||||
for i in flags:
|
||||
if i == "_content":
|
||||
continue
|
||||
loginfo['flag'] = i
|
||||
loginfo['detail'] = flags[i]
|
||||
self.varhistory.record(**loginfo)
|
||||
self.dict[var][i] = flags[i]
|
||||
|
||||
def getVarFlags(self, var):
|
||||
@@ -635,16 +391,13 @@ class DataSmart(MutableMapping):
|
||||
return flags
|
||||
|
||||
|
||||
def delVarFlags(self, var, **loginfo):
|
||||
def delVarFlags(self, var):
|
||||
if not var in self.dict:
|
||||
self._makeShadowCopy(var)
|
||||
|
||||
if var in self.dict:
|
||||
content = None
|
||||
|
||||
loginfo['op'] = 'delete flags'
|
||||
self.varhistory.record(**loginfo)
|
||||
|
||||
# try to save the content
|
||||
if "_content" in self.dict[var]:
|
||||
content = self.dict[var]["_content"]
|
||||
@@ -661,11 +414,6 @@ class DataSmart(MutableMapping):
|
||||
# we really want this to be a DataSmart...
|
||||
data = DataSmart(seen=self._seen_overrides.copy(), special=self._special_values.copy())
|
||||
data.dict["_data"] = self.dict
|
||||
data.varhistory = self.varhistory.copy()
|
||||
data.varhistory.datasmart = data
|
||||
data.inchistory = self.inchistory.copy()
|
||||
|
||||
data._tracking = self._tracking
|
||||
|
||||
return data
|
||||
|
||||
@@ -726,33 +474,13 @@ class DataSmart(MutableMapping):
|
||||
|
||||
def get_hash(self):
|
||||
data = {}
|
||||
d = self.createCopy()
|
||||
bb.data.expandKeys(d)
|
||||
bb.data.update_data(d)
|
||||
|
||||
config_whitelist = set((d.getVar("BB_HASHCONFIG_WHITELIST", True) or "").split())
|
||||
keys = set(key for key in iter(d) if not key.startswith("__"))
|
||||
config_whitelist = set((self.getVar("BB_HASHCONFIG_WHITELIST", True) or "").split())
|
||||
keys = set(key for key in iter(self) if not key.startswith("__"))
|
||||
for key in keys:
|
||||
if key in config_whitelist:
|
||||
continue
|
||||
value = d.getVar(key, False) or ""
|
||||
value = self.getVar(key, False) or ""
|
||||
data.update({key:value})
|
||||
|
||||
varflags = d.getVarFlags(key)
|
||||
if not varflags:
|
||||
continue
|
||||
for f in varflags:
|
||||
data.update({'%s[%s]' % (key, f):varflags[f]})
|
||||
|
||||
for key in ["__BBTASKS", "__BBANONFUNCS", "__BBHANDLERS"]:
|
||||
bb_list = d.getVar(key, False) or []
|
||||
bb_list.sort()
|
||||
data.update({key:str(bb_list)})
|
||||
|
||||
if key == "__BBANONFUNCS":
|
||||
for i in bb_list:
|
||||
value = d.getVar(i, True) or ""
|
||||
data.update({i:value})
|
||||
|
||||
data_str = str([(k, data[k]) for k in sorted(data.keys())])
|
||||
return hashlib.md5(data_str).hexdigest()
|
||||
|
||||
@@ -566,19 +566,3 @@ class SanityCheckFailed(Event):
        Event.__init__(self)
        self._msg = msg
        self._network_error = network_error

class NetworkTest(Event):
    """
    Event to start network test
    """

class NetworkTestPassed(Event):
    """
    Event to indicate network test has passed
    """

class NetworkTestFailed(Event):
    """
    Event to indicate network test has failed
    """
|
||||
|
||||
@@ -30,11 +30,6 @@ from __future__ import print_function
|
||||
import os, re
|
||||
import logging
|
||||
import urllib
|
||||
import urlparse
|
||||
if 'git' not in urlparse.uses_netloc:
|
||||
urlparse.uses_netloc.append('git')
|
||||
from urlparse import urlparse
|
||||
import operator
|
||||
import bb.persist_data, bb.utils
|
||||
import bb.checksum
|
||||
from bb import data
|
||||
@@ -74,9 +69,6 @@ class FetchError(BBFetchException):
|
||||
|
||||
class ChecksumError(FetchError):
|
||||
"""Exception when mismatched checksum encountered"""
|
||||
def __init__(self, message, url = None, checksum = None):
|
||||
self.checksum = checksum
|
||||
FetchError.__init__(self, message, url)
|
||||
|
||||
class NoChecksumError(FetchError):
|
||||
"""Exception when no checksum is specified, but BB_STRICT_CHECKSUM is set"""
|
||||
@@ -127,205 +119,12 @@ class NonLocalMethod(Exception):
|
||||
def __init__(self):
|
||||
Exception.__init__(self)
|
||||
|
||||
|
||||
class URI(object):
|
||||
"""
|
||||
A class representing a generic URI, with methods for
|
||||
accessing the URI components, and stringifies to the
|
||||
URI.
|
||||
|
||||
It is constructed by calling it with a URI, or setting
|
||||
the attributes manually:
|
||||
|
||||
uri = URI("http://example.com/")
|
||||
|
||||
uri = URI()
|
||||
uri.scheme = 'http'
|
||||
uri.hostname = 'example.com'
|
||||
uri.path = '/'
|
||||
|
||||
It has the following attributes:
|
||||
|
||||
* scheme (read/write)
|
||||
* userinfo (authentication information) (read/write)
|
||||
* username (read/write)
|
||||
* password (read/write)
|
||||
|
||||
Note, password is deprecated as of RFC 3986.
|
||||
|
||||
* hostname (read/write)
|
||||
* port (read/write)
|
||||
* hostport (read only)
|
||||
"hostname:port", if both are set, otherwise just "hostname"
|
||||
* path (read/write)
|
||||
* path_quoted (read/write)
|
||||
A URI quoted version of path
|
||||
* params (dict) (read/write)
|
||||
* relative (bool) (read only)
|
||||
True if this is a "relative URI", (e.g. file:foo.diff)
|
||||
|
||||
It stringifies to the URI itself.
|
||||
|
||||
Some notes about relative URIs: while it's specified that
|
||||
a URI beginning with <scheme>:// should either be directly
|
||||
followed by a hostname or a /, the old URI handling of the
|
||||
fetch2 library did not conform to this. Therefore, this URI
|
||||
class has some kludges to make sure that URIs are parsed in
|
||||
a way conforming to bitbake's current usage. This URI class
|
||||
supports the following:
|
||||
|
||||
file:relative/path.diff (IETF compliant)
|
||||
git:relative/path.git (IETF compliant)
|
||||
git:///absolute/path.git (IETF compliant)
|
||||
file:///absolute/path.diff (IETF compliant)
|
||||
|
||||
file://relative/path.diff (not IETF compliant)
|
||||
|
||||
But it does not support the following:
|
||||
|
||||
file://hostname/absolute/path.diff (would be IETF compliant)
|
||||
|
||||
Note that the last case only applies to a list of
|
||||
"whitelisted" schemes (currently only file://), that requires
|
||||
its URIs to not have a network location.
|
||||
"""
|
||||
|
||||
_relative_schemes = ['file', 'git']
|
||||
_netloc_forbidden = ['file']
|
||||
|
||||
def __init__(self, uri=None):
|
||||
self.scheme = ''
|
||||
self.userinfo = ''
|
||||
self.hostname = ''
|
||||
self.port = None
|
||||
self._path = ''
|
||||
self.params = {}
|
||||
self.relative = False
|
||||
|
||||
if not uri:
|
||||
return
|
||||
|
||||
urlp = urlparse(uri)
|
||||
self.scheme = urlp.scheme
|
||||
|
||||
# Convert URI to be relative
|
||||
if urlp.scheme in self._netloc_forbidden:
|
||||
uri = re.sub("(?<=:)//(?!/)", "", uri, 1)
|
||||
urlp = urlparse(uri)
|
||||
|
||||
# Identify if the URI is relative or not
|
||||
if urlp.scheme in self._relative_schemes and \
|
||||
re.compile("^\w+:(?!//)").match(uri):
|
||||
self.relative = True
|
||||
|
||||
if not self.relative:
|
||||
self.hostname = urlp.hostname or ''
|
||||
self.port = urlp.port
|
||||
|
||||
self.userinfo += urlp.username or ''
|
||||
|
||||
if urlp.password:
|
||||
self.userinfo += ':%s' % urlp.password
|
||||
|
||||
# Do support params even for URI schemes that Python's
|
||||
# urlparse doesn't support params for.
|
||||
path = ''
|
||||
param_str = ''
|
||||
if not urlp.params:
|
||||
path, param_str = (list(urlp.path.split(";", 1)) + [None])[:2]
|
||||
else:
|
||||
path = urlp.path
|
||||
param_str = urlp.params
|
||||
|
||||
self.path = urllib.unquote(path)
|
||||
|
||||
if param_str:
|
||||
self.params = self._param_dict(param_str)
|
||||
|
||||
def __str__(self):
|
||||
userinfo = self.userinfo
|
||||
if userinfo:
|
||||
userinfo += '@'
|
||||
|
||||
return "%s:%s%s%s%s%s" % (
|
||||
self.scheme,
|
||||
'' if self.relative else '//',
|
||||
userinfo,
|
||||
self.hostport,
|
||||
self.path_quoted,
|
||||
self._param_str)
|
||||
|
||||
@property
|
||||
def _param_str(self):
|
||||
ret = ''
|
||||
for key, val in self.params.items():
|
||||
ret += ";%s=%s" % (key, val)
|
||||
return ret
|
||||
|
||||
def _param_dict(self, param_str):
|
||||
parm = {}
|
||||
|
||||
for keyval in param_str.split(";"):
|
||||
key, val = keyval.split("=", 1)
|
||||
parm[key] = val
|
||||
|
||||
return parm
|
||||
|
||||
@property
|
||||
def hostport(self):
|
||||
if not self.port:
|
||||
return self.hostname
|
||||
return "%s:%d" % (self.hostname, self.port)
|
||||
|
||||
@property
|
||||
def path_quoted(self):
|
||||
return urllib.quote(self.path)
|
||||
|
||||
@path_quoted.setter
|
||||
def path_quoted(self, path):
|
||||
self.path = urllib.unquote(path)
|
||||
|
||||
@property
|
||||
def path(self):
|
||||
return self._path
|
||||
|
||||
@path.setter
|
||||
def path(self, path):
|
||||
self._path = path
|
||||
|
||||
if re.compile("^/").match(path):
|
||||
self.relative = False
|
||||
else:
|
||||
self.relative = True
|
||||
|
||||
@property
|
||||
def username(self):
|
||||
if self.userinfo:
|
||||
return (self.userinfo.split(":", 1))[0]
|
||||
return ''
|
||||
|
||||
@username.setter
|
||||
def username(self, username):
|
||||
self.userinfo = username
|
||||
if self.password:
|
||||
self.userinfo += ":%s" % self.password
|
||||
|
||||
@property
|
||||
def password(self):
|
||||
if self.userinfo and ":" in self.userinfo:
|
||||
return (self.userinfo.split(":", 1))[1]
|
||||
return ''
|
||||
|
||||
@password.setter
|
||||
def password(self, password):
|
||||
self.userinfo = "%s:%s" % (self.username, password)
|
||||
|
||||
def decodeurl(url):
|
||||
"""Decodes an URL into the tokens (scheme, network location, path,
|
||||
user, password, parameters).
|
||||
"""
|
||||
|
||||
m = re.compile('(?P<type>[^:]*)://((?P<user>[^/]+)@)?(?P<location>[^;]+)(;(?P<parm>.*))?').match(url)
|
||||
m = re.compile('(?P<type>[^:]*)://((?P<user>.+)@)?(?P<location>[^;]+)(;(?P<parm>.*))?').match(url)
|
||||
if not m:
|
||||
raise MalformedUrl(url)
|
||||
|
||||
@@ -414,8 +213,6 @@ def uri_replace(ud, uri_find, uri_replace, replacements, d):
|
||||
return None
|
||||
# Overwrite any specified replacement parameters
|
||||
for k in uri_replace_decoded[loc]:
|
||||
for l in replacements:
|
||||
uri_replace_decoded[loc][k] = uri_replace_decoded[loc][k].replace(l, replacements[l])
|
||||
result_decoded[loc][k] = uri_replace_decoded[loc][k]
|
||||
elif (re.match(regexp, uri_decoded[loc])):
|
||||
if not uri_replace_decoded[loc]:
|
||||
@@ -564,7 +361,7 @@ def verify_checksum(u, ud, d):
|
||||
msg = msg + '\nIf this change is expected (e.g. you have upgraded to a new version without updating the checksums) then you can use these lines within the recipe:\nSRC_URI[%s] = "%s"\nSRC_URI[%s] = "%s"\nOtherwise you should retry the download and/or check with upstream to determine if the file has become corrupted or otherwise unexpectedly modified.\n' % (ud.md5_name, md5data, ud.sha256_name, sha256data)
|
||||
|
||||
if len(msg):
|
||||
raise ChecksumError('Checksum mismatch!%s' % msg, u, md5data)
|
||||
raise ChecksumError('Checksum mismatch!%s' % msg, u)
|
||||
|
||||
|
||||
def update_stamp(u, ud, d):
|
||||
@@ -625,18 +422,10 @@ def get_srcrev(d):
|
||||
if not format:
|
||||
raise FetchError("The SRCREV_FORMAT variable must be set when multiple SCMs are used.")
|
||||
|
||||
autoinc = False
|
||||
autoinc_templ = 'AUTOINC+'
|
||||
for scm in scms:
|
||||
ud = urldata[scm]
|
||||
for name in ud.names:
|
||||
rev = ud.method.sortable_revision(scm, ud, d, name)
|
||||
if rev.startswith(autoinc_templ):
|
||||
if not autoinc:
|
||||
autoinc = True
|
||||
format = "%s%s" % (autoinc_templ, format)
|
||||
rev = rev[len(autoinc_templ):]
|
||||
|
||||
format = format.replace(name, rev)
|
||||
|
||||
return format
|
||||
@@ -660,16 +449,11 @@ def runfetchcmd(cmd, d, quiet = False, cleanup = []):
|
||||
# rather than host provided
|
||||
# Also include some other variables.
|
||||
# FIXME: Should really include all export variables?
|
||||
exportvars = ['HOME', 'PATH',
|
||||
'HTTP_PROXY', 'http_proxy',
|
||||
'HTTPS_PROXY', 'https_proxy',
|
||||
'FTP_PROXY', 'ftp_proxy',
|
||||
'FTPS_PROXY', 'ftps_proxy',
|
||||
'NO_PROXY', 'no_proxy',
|
||||
'ALL_PROXY', 'all_proxy',
|
||||
'GIT_PROXY_COMMAND',
|
||||
'SSH_AUTH_SOCK', 'SSH_AGENT_PID',
|
||||
'SOCKS5_USER', 'SOCKS5_PASSWD']
|
||||
exportvars = ['PATH', 'GIT_PROXY_COMMAND', 'GIT_PROXY_HOST',
|
||||
'GIT_PROXY_PORT', 'GIT_CONFIG', 'http_proxy', 'ftp_proxy',
|
||||
'https_proxy', 'no_proxy', 'ALL_PROXY', 'all_proxy',
|
||||
'SSH_AUTH_SOCK', 'SSH_AGENT_PID', 'HOME',
|
||||
'GIT_PROXY_IGNORE', 'SOCKS5_USER', 'SOCKS5_PASSWD']
|
||||
|
||||
for var in exportvars:
|
||||
val = d.getVar(var, True)
|
||||
@@ -756,19 +540,6 @@ def build_mirroruris(origud, mirrors, ld):
|
||||
|
||||
return uris, uds
|
||||
|
||||
def rename_bad_checksum(ud, suffix):
|
||||
"""
|
||||
Renames files to have suffix from parameter
|
||||
"""
|
||||
|
||||
if ud.localpath is None:
|
||||
return
|
||||
|
||||
new_localpath = "%s_bad-checksum_%s" % (ud.localpath, suffix)
|
||||
bb.warn("Renaming %s to %s" % (ud.localpath, new_localpath))
|
||||
bb.utils.movefile(ud.localpath, new_localpath)
|
||||
|
||||
|
||||
def try_mirror_url(newuri, origud, ud, ld, check = False):
|
||||
# Return of None or a value means we're finished
|
||||
# False means try another url
|
||||
@@ -797,7 +568,6 @@ def try_mirror_url(newuri, origud, ud, ld, check = False):
|
||||
dldir = ld.getVar("DL_DIR", True)
|
||||
if origud.mirrortarball and os.path.basename(ud.localpath) == os.path.basename(origud.mirrortarball) \
|
||||
and os.path.basename(ud.localpath) != os.path.basename(origud.localpath):
|
||||
bb.utils.mkdirhier(os.path.dirname(ud.donestamp))
|
||||
open(ud.donestamp, 'w').close()
|
||||
dest = os.path.join(dldir, os.path.basename(ud.localpath))
|
||||
if not os.path.exists(dest):
|
||||
@@ -820,7 +590,6 @@ def try_mirror_url(newuri, origud, ud, ld, check = False):
|
||||
if isinstance(e, ChecksumError):
|
||||
logger.warn("Mirror checksum failure for url %s (original url: %s)\nCleaning and trying again." % (newuri, origud.url))
|
||||
logger.warn(str(e))
|
||||
rename_bad_checksum(ud, e.checksum)
|
||||
elif isinstance(e, NoChecksumError):
|
||||
raise
|
||||
else:
|
||||
@@ -872,14 +641,11 @@ def srcrev_internal_helper(ud, d, name):
|
||||
if not rev:
|
||||
rev = d.getVar("SRCREV_%s" % name, True)
|
||||
if not rev:
|
||||
rev = d.getVar("SRCREV_pn-%s" % pn, True)
|
||||
rev = d.getVar("SRCREV_pn-%s" % pn, True)
|
||||
if not rev:
|
||||
rev = d.getVar("SRCREV", True)
|
||||
if rev == "INVALID":
|
||||
var = "SRCREV_pn-%s" % pn
|
||||
if name != '':
|
||||
var = "SRCREV_%s_pn-%s" % (name, pn)
|
||||
raise FetchError("Please set %s to a valid value" % var, ud.url)
|
||||
raise FetchError("Please set SRCREV to a valid value", ud.url)
|
||||
if rev == "AUTOINC":
|
||||
rev = ud.method.latest_revision(ud.url, ud, d, name)
|
||||
|
||||
@@ -955,7 +721,7 @@ def get_file_checksums(filelist, pn):
|
||||
if checksum:
|
||||
checksums.append((pth, checksum))
|
||||
|
||||
checksums.sort(key=operator.itemgetter(1))
|
||||
checksums.sort()
|
||||
return checksums
|
||||
|
||||
|
||||
@@ -971,7 +737,6 @@ class FetchData(object):
|
||||
self.lockfile = None
|
||||
self.mirrortarball = None
|
||||
self.basename = None
|
||||
self.basepath = None
|
||||
(self.type, self.host, self.path, self.user, self.pswd, self.parm) = decodeurl(data.expand(url, d))
|
||||
self.date = self.getSRCDate(d)
|
||||
self.url = url
|
||||
@@ -989,13 +754,13 @@ class FetchData(object):
|
||||
self.sha256_name = "sha256sum"
|
||||
if self.md5_name in self.parm:
|
||||
self.md5_expected = self.parm[self.md5_name]
|
||||
elif self.type not in ["http", "https", "ftp", "ftps", "sftp"]:
|
||||
elif self.type not in ["http", "https", "ftp", "ftps"]:
|
||||
self.md5_expected = None
|
||||
else:
|
||||
self.md5_expected = d.getVarFlag("SRC_URI", self.md5_name)
|
||||
if self.sha256_name in self.parm:
|
||||
self.sha256_expected = self.parm[self.sha256_name]
|
||||
elif self.type not in ["http", "https", "ftp", "ftps", "sftp"]:
|
||||
elif self.type not in ["http", "https", "ftp", "ftps"]:
|
||||
self.sha256_expected = None
|
||||
else:
|
||||
self.sha256_expected = d.getVarFlag("SRC_URI", self.sha256_name)
|
||||
@@ -1028,14 +793,8 @@ class FetchData(object):
|
||||
elif self.localfile:
|
||||
self.localpath = self.method.localpath(self.url, self, d)
|
||||
|
||||
dldir = d.getVar("DL_DIR", True)
|
||||
# Note: .done and .lock files should always be in DL_DIR whereas localpath may not be.
|
||||
if self.localpath and self.localpath.startswith(dldir):
|
||||
basepath = self.localpath
|
||||
elif self.localpath:
|
||||
basepath = dldir + os.sep + os.path.basename(self.localpath)
|
||||
else:
|
||||
basepath = dldir + os.sep + (self.basepath or self.basename)
|
||||
# Note: These files should always be in DL_DIR whereas localpath may not be.
|
||||
basepath = d.expand("${DL_DIR}/%s" % os.path.basename(self.localpath or self.basename))
|
||||
self.donestamp = basepath + '.done'
|
||||
self.lockfile = basepath + '.lock'
|
||||
|
||||
@@ -1204,18 +963,15 @@ class FetchMethod(object):
|
||||
dest = os.path.join(rootdir, os.path.basename(file))
|
||||
if (file != dest) and not (os.path.exists(dest) and os.path.samefile(file, dest)):
|
||||
if os.path.isdir(file):
|
||||
# If for example we're asked to copy file://foo/bar, we need to unpack the result into foo/bar
|
||||
basepath = getattr(urldata, "basepath", None)
|
||||
filesdir = os.path.realpath(data.getVar("FILESDIR", True))
|
||||
destdir = "."
|
||||
if basepath and basepath.endswith("/"):
|
||||
basepath = basepath.rstrip("/")
|
||||
elif basepath:
|
||||
basepath = os.path.dirname(basepath)
|
||||
if basepath and basepath.find("/") != -1:
|
||||
destdir = basepath[:basepath.rfind('/')]
|
||||
if file[0:len(filesdir)] == filesdir:
|
||||
destdir = file[len(filesdir):file.rfind('/')]
|
||||
destdir = destdir.strip('/')
|
||||
if destdir != "." and not os.access("%s/%s" % (rootdir, destdir), os.F_OK):
|
||||
os.makedirs("%s/%s" % (rootdir, destdir))
|
||||
if len(destdir) < 1:
|
||||
destdir = "."
|
||||
elif not os.access("%s/%s" % (rootdir, destdir), os.F_OK):
|
||||
os.makedirs("%s/%s" % (rootdir, destdir))
|
||||
cmd = 'cp -pPR %s %s/%s/' % (file, rootdir, destdir)
|
||||
#cmd = 'tar -cf - -C "%d" -ps . | tar -xf - -C "%s/%s/"' % (file, rootdir, destdir)
|
||||
else:
|
||||
@@ -1277,6 +1033,23 @@ class FetchMethod(object):
|
||||
logger.info("URL %s could not be checked for status since no method exists.", url)
|
||||
return True
|
||||
|
||||
def localcount_internal_helper(ud, d, name):
|
||||
"""
|
||||
Return:
|
||||
a) a locked localcount if specified
|
||||
b) None otherwise
|
||||
"""
|
||||
|
||||
localcount = None
|
||||
if name != '':
|
||||
pn = d.getVar("PN", True)
|
||||
localcount = d.getVar("LOCALCOUNT_" + name, True)
|
||||
if not localcount:
|
||||
localcount = d.getVar("LOCALCOUNT", True)
|
||||
return localcount
|
||||
|
||||
localcount_internal_helper = staticmethod(localcount_internal_helper)
|
||||
|
||||
def latest_revision(self, url, ud, d, name):
|
||||
"""
|
||||
Look in the cache for the latest revision, if not present ask the SCM.
|
||||
@@ -1299,8 +1072,36 @@ class FetchMethod(object):
|
||||
if hasattr(self, "_sortable_revision"):
|
||||
return self._sortable_revision(url, ud, d)
|
||||
|
||||
localcounts = bb.persist_data.persist('BB_URI_LOCALCOUNT', d)
|
||||
key = self.generate_revision_key(url, ud, d, name)
|
||||
|
||||
latest_rev = self._build_revision(url, ud, d, name)
|
||||
return 'AUTOINC+%s' % str(latest_rev)
|
||||
last_rev = localcounts.get(key + '_rev')
|
||||
uselocalcount = d.getVar("BB_LOCALCOUNT_OVERRIDE", True) or False
|
||||
count = None
|
||||
if uselocalcount:
|
||||
count = FetchMethod.localcount_internal_helper(ud, d, name)
|
||||
if count is None:
|
||||
count = localcounts.get(key + '_count') or "0"
|
||||
|
||||
if last_rev == latest_rev:
|
||||
return str(count + "+" + latest_rev)
|
||||
|
||||
buildindex_provided = hasattr(self, "_sortable_buildindex")
|
||||
if buildindex_provided:
|
||||
count = self._sortable_buildindex(url, ud, d, latest_rev)
|
||||
|
||||
if count is None:
|
||||
count = "0"
|
||||
elif uselocalcount or buildindex_provided:
|
||||
count = str(count)
|
||||
else:
|
||||
count = str(int(count) + 1)
|
||||
|
||||
localcounts[key + '_rev'] = latest_rev
|
||||
localcounts[key + '_count'] = count
|
||||
|
||||
return str(count + "+" + latest_rev)
|
||||
|
||||
def generate_revision_key(self, url, ud, d, name):
|
||||
key = self._revision_key(url, ud, d, name)
|
||||
@@ -1405,7 +1206,6 @@ class Fetch(object):
|
||||
if isinstance(e, ChecksumError):
|
||||
logger.warn("Checksum failure encountered with download of %s - will attempt other sources if available" % u)
|
||||
logger.debug(1, str(e))
|
||||
rename_bad_checksum(ud, e.checksum)
|
||||
elif isinstance(e, NoChecksumError):
|
||||
raise
|
||||
else:
|
||||
@@ -1515,13 +1315,11 @@ class Fetch(object):
|
||||
|
||||
from . import cvs
|
||||
from . import git
|
||||
from . import gitsm
|
||||
from . import local
|
||||
from . import svn
|
||||
from . import wget
|
||||
from . import svk
|
||||
from . import ssh
|
||||
from . import sftp
|
||||
from . import perforce
|
||||
from . import bzr
|
||||
from . import hg
|
||||
@@ -1532,11 +1330,9 @@ methods.append(local.Local())
|
||||
methods.append(wget.Wget())
|
||||
methods.append(svn.Svn())
|
||||
methods.append(git.Git())
|
||||
methods.append(gitsm.GitSM())
|
||||
methods.append(cvs.Cvs())
|
||||
methods.append(svk.Svk())
|
||||
methods.append(ssh.SSH())
|
||||
methods.append(sftp.SFTP())
|
||||
methods.append(perforce.Perforce())
|
||||
methods.append(bzr.Bzr())
|
||||
methods.append(hg.Hg())
|
||||
|
||||
@@ -11,8 +11,8 @@ Supported SRC_URI options are:
|
||||
- branch
|
||||
The git branch to retrieve from. The default is "master"
|
||||
|
||||
This option also supports multiple branch fetching, with branches
|
||||
separated by commas. In multiple branches case, the name option
|
||||
this option also support multiple branches fetching, branches
|
||||
are seperated by comma. in multiple branches case, the name option
|
||||
must have the same number of names to match the branches, which is
|
||||
used to specify the SRC_REV for the branch
|
||||
e.g:
|
||||
@@ -25,13 +25,13 @@ Supported SRC_URI options are:
|
||||
|
||||
- protocol
|
||||
The method to use to access the repository. Common options are "git",
|
||||
"http", "https", "file", "ssh" and "rsync". The default is "git".
|
||||
"http", "file" and "rsync". The default is "git"
|
||||
|
||||
- rebaseable
|
||||
rebaseable indicates that the upstream git repo may rebase in the future,
|
||||
and current revision may disappear from upstream repo. This option will
|
||||
remind fetcher to preserve local cache carefully for future use.
|
||||
The default value is "0", set rebaseable=1 for rebaseable git repo.
|
||||
reminder fetcher to preserve local cache carefully for future use.
|
||||
The default value is "0", set rebaseable=1 for rebaseable git repo
|
||||
|
||||
- nocheckout
|
||||
Don't checkout source code when unpacking. set this option for the recipe
|
||||
@@ -71,8 +71,11 @@ from bb.fetch2 import logger
|
||||
class Git(FetchMethod):
|
||||
"""Class to fetch a module or modules from git repositories"""
|
||||
def init(self, d):
|
||||
pass
|
||||
|
||||
#
|
||||
# Only enable _sortable revision if the key is set
|
||||
#
|
||||
if d.getVar("BB_GIT_CLONE_FOR_SRCREV", True):
|
||||
self._sortable_buildindex = self._sortable_buildindex_disabled
|
||||
def supports(self, url, ud, d):
|
||||
"""
|
||||
Check to see if a given url can be fetched with git.
|
||||
@@ -217,10 +220,6 @@ class Git(FetchMethod):
|
||||
def build_mirror_data(self, url, ud, d):
|
||||
# Generate a mirror tarball if needed
|
||||
if ud.write_tarballs and (ud.repochanged or not os.path.exists(ud.fullmirror)):
|
||||
# it's possible that this symlink points to read-only filesystem with PREMIRROR
|
||||
if os.path.islink(ud.fullmirror):
|
||||
os.unlink(ud.fullmirror)
|
||||
|
||||
os.chdir(ud.clonedir)
|
||||
logger.info("Creating tarball of git repository")
|
||||
runfetchcmd("tar -czf %s %s" % (ud.fullmirror, os.path.join(".") ), d)
|
||||
@@ -238,7 +237,7 @@ class Git(FetchMethod):
|
||||
def_destsuffix = "git/"
|
||||
|
||||
destsuffix = ud.parm.get("destsuffix", def_destsuffix)
|
||||
destdir = ud.destdir = os.path.join(destdir, destsuffix)
|
||||
destdir = os.path.join(destdir, destsuffix)
|
||||
if os.path.exists(destdir):
|
||||
bb.utils.prunedir(destdir)
|
||||
|
||||
@@ -317,6 +316,38 @@ class Git(FetchMethod):
|
||||
def _build_revision(self, url, ud, d, name):
|
||||
return ud.revisions[name]
|
||||
|
||||
def _sortable_buildindex_disabled(self, url, ud, d, rev):
|
||||
"""
|
||||
Return a suitable buildindex for the revision specified. This is done by counting revisions
|
||||
using "git rev-list" which may or may not work in different circumstances.
|
||||
"""
|
||||
|
||||
cwd = os.getcwd()
|
||||
|
||||
# Check if we have the rev already
|
||||
|
||||
if not os.path.exists(ud.clonedir):
|
||||
logger.debug(1, "GIT repository for %s does not exist in %s. \
|
||||
Downloading.", url, ud.clonedir)
|
||||
self.download(None, ud, d)
|
||||
if not os.path.exists(ud.clonedir):
|
||||
logger.error("GIT repository for %s does not exist in %s after \
|
||||
download. Cannot get sortable buildnumber, using \
|
||||
old value", url, ud.clonedir)
|
||||
return None
|
||||
|
||||
|
||||
os.chdir(ud.clonedir)
|
||||
if not self._contains_ref(rev, d):
|
||||
self.download(None, ud, d)
|
||||
|
||||
output = runfetchcmd("%s rev-list %s -- 2> /dev/null | wc -l" % (ud.basecmd, rev), d, quiet=True)
|
||||
os.chdir(cwd)
|
||||
|
||||
buildindex = "%s" % output.split()[0]
|
||||
logger.debug(1, "GIT repository for %s in %s is returning %s revisions in rev-list before %s", url, ud.clonedir, buildindex, rev)
|
||||
return buildindex
|
||||
|
||||
def checkstatus(self, uri, ud, d):
|
||||
fetchcmd = "%s ls-remote %s" % (ud.basecmd, uri)
|
||||
try:
|
||||
|
||||
@@ -1,78 +0,0 @@
|
||||
# ex:ts=4:sw=4:sts=4:et
|
||||
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
|
||||
"""
|
||||
BitBake 'Fetch' git submodules implementation
|
||||
"""
|
||||
|
||||
# Copyright (C) 2013 Richard Purdie
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import os
|
||||
import bb
|
||||
from bb import data
|
||||
from bb.fetch2.git import Git
|
||||
from bb.fetch2 import runfetchcmd
|
||||
from bb.fetch2 import logger
|
||||
|
||||
class GitSM(Git):
|
||||
def supports(self, url, ud, d):
|
||||
"""
|
||||
Check to see if a given url can be fetched with git.
|
||||
"""
|
||||
return ud.type in ['gitsm']
|
||||
|
||||
def uses_submodules(self, ud, d):
|
||||
for name in ud.names:
|
||||
try:
|
||||
runfetchcmd("%s show %s:.gitmodules" % (ud.basecmd, ud.revisions[name]), d, quiet=True)
|
||||
return True
|
||||
except bb.fetch.FetchError:
|
||||
pass
|
||||
return False
|
||||
|
||||
def update_submodules(self, u, ud, d):
|
||||
# We have to convert bare -> full repo, do the submodule bit, then convert back
|
||||
tmpclonedir = ud.clonedir + ".tmp"
|
||||
gitdir = tmpclonedir + os.sep + ".git"
|
||||
bb.utils.remove(tmpclonedir, True)
|
||||
os.mkdir(tmpclonedir)
|
||||
os.rename(ud.clonedir, gitdir)
|
||||
runfetchcmd("sed " + gitdir + "/config -i -e 's/bare.*=.*true/bare = false/'", d)
|
||||
os.chdir(tmpclonedir)
|
||||
runfetchcmd("git reset --hard", d)
|
||||
runfetchcmd("git submodule init", d)
|
||||
runfetchcmd("git submodule update", d)
|
||||
runfetchcmd("sed " + gitdir + "/config -i -e 's/bare.*=.*false/bare = true/'", d)
|
||||
os.rename(gitdir, ud.clonedir,)
|
||||
bb.utils.remove(tmpclonedir, True)
|
||||
|
||||
def download(self, loc, ud, d):
|
||||
Git.download(self, loc, ud, d)
|
||||
|
||||
os.chdir(ud.clonedir)
|
||||
submodules = self.uses_submodules(ud, d)
|
||||
if submodules:
|
||||
self.update_submodules(loc, ud, d)
|
||||
|
||||
def unpack(self, ud, destdir, d):
|
||||
Git.unpack(self, ud, destdir, d)
|
||||
|
||||
os.chdir(ud.destdir)
|
||||
submodules = self.uses_submodules(ud, d)
|
||||
if submodules:
|
||||
runfetchcmd("cp -r " + ud.clonedir + "/modules " + ud.destdir + "/.git/", d)
|
||||
runfetchcmd("git submodule init", d)
|
||||
runfetchcmd("git submodule update", d)
|
||||
|
||||
@@ -92,21 +92,13 @@ class Hg(FetchMethod):
if not ud.user:
hgroot = host + ud.path
else:
if ud.pswd:
hgroot = ud.user + ":" + ud.pswd + "@" + host + ud.path
else:
hgroot = ud.user + "@" + host + ud.path
hgroot = ud.user + "@" + host + ud.path

if command == "info":
return "%s identify -i %s://%s/%s" % (basecmd, proto, hgroot, ud.module)

options = [];

# Don't specify revision for the fetch; clone the entire repo.
# This avoids an issue if the specified revision is a tag, because
# the tag actually exists in the specified revision + 1, so it won't
# be available when used in any successive commands.
if ud.revision and command != "fetch":
if ud.revision:
options.append("-r %s" % ud.revision)

if command == "fetch":
@@ -115,10 +107,7 @@ class Hg(FetchMethod):
# do not pass options list; limiting pull to rev causes the local
# repo not to contain it and immediately following "update" command
# will crash
if ud.user and ud.pswd:
cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" pull" % (basecmd, ud.user, ud.pswd, proto)
else:
cmd = "%s pull" % (basecmd)
cmd = "%s pull" % (basecmd)
elif command == "update":
cmd = "%s update -C %s" % (basecmd, " ".join(options))
else:
@@ -44,7 +44,6 @@ class Local(FetchMethod):
# We don't set localfile as for this fetcher the file is already local!
ud.decodedurl = urllib.unquote(ud.url.split("://")[1].split(";")[0])
ud.basename = os.path.basename(ud.decodedurl)
ud.basepath = ud.decodedurl
return

def localpath(self, url, urldata, d):
@@ -63,12 +62,7 @@ class Local(FetchMethod):
if filesdir:
logger.debug(2, "Searching for %s in path: %s" % (path, filesdir))
newpath = os.path.join(filesdir, path)
if (not newpath or not os.path.exists(newpath)) and path.find("*") != -1:
# For expressions using '*', best we can do is take the first directory in FILESPATH that exists
newpath = bb.utils.which(filespath, ".")
logger.debug(2, "Searching for %s in path: %s" % (path, newpath))
return newpath
if not os.path.exists(newpath):
if not os.path.exists(newpath) and path.find("*") == -1:
dldirfile = os.path.join(d.getVar("DL_DIR", True), path)
logger.debug(2, "Defaulting to %s for %s" % (dldirfile, path))
bb.utils.mkdirhier(os.path.dirname(dldirfile))
@@ -112,7 +112,7 @@ class Perforce(FetchMethod):
base = path
which = path.find('/...')
if which != -1:
base = path[:which-1]
base = path[:which]

base = self._strip_leading_slashes(base)

@@ -170,7 +170,7 @@ class Perforce(FetchMethod):
logger.info("Fetch " + loc)
logger.info("%s%s files %s", p4cmd, p4opt, depot)
p4file, errors = bb.process.run("%s%s files %s" % (p4cmd, p4opt, depot))
p4file = [f.rstrip() for f in p4file.splitlines()]
p4file = p4file.strip()

if not p4file:
raise FetchError("Fetch: unable to get the P4 files from %s" % depot, loc)
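Aside (illustration only, with a made-up depot path): the first Perforce hunk above changes the slice used to derive the base path from a location containing the "/..." wildcard. The two variants differ by one character:

path = "depot/project/..."
which = path.find('/...')
path[:which]       # "depot/project"
path[:which-1]     # "depot/projec"  (one character shorter)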
@@ -1,129 +0,0 @@
# ex:ts=4:sw=4:sts=4:et
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
"""
BitBake SFTP Fetch implementation

Class for fetching files via SFTP. It tries to adhere to the (now
expired) IETF Internet Draft for "Uniform Resource Identifier (URI)
Scheme for Secure File Transfer Protocol (SFTP) and Secure Shell
(SSH)" (SECSH URI).

It uses SFTP (so as to adhere to the SECSH URI specification). It only
supports key based authentication, not password. This class, unlike
the SSH fetcher, does not support fetching a directory tree from the
remote.

http://tools.ietf.org/html/draft-ietf-secsh-scp-sftp-ssh-uri-04
https://www.iana.org/assignments/uri-schemes/prov/sftp
https://tools.ietf.org/html/draft-ietf-secsh-filexfer-13

Please note that '/' is used as host path separator, and not ":"
as you may be used to from the scp/sftp commands. You can use a
~ (tilde) to specify a path relative to your home directory.
(The /~user/ syntax, for specifying a path relative to another
user's home directory, is not supported.) Note that the tilde must
still follow the host path separator ("/"). See examples below.

Example SRC_URIs:

SRC_URI = "sftp://host.example.com/dir/path.file.txt"

A path relative to your home directory.

SRC_URI = "sftp://host.example.com/~/dir/path.file.txt"

You can also specify a username (specifying a password in the
URI is not supported, use SSH keys to authenticate):

SRC_URI = "sftp://user@host.example.com/dir/path.file.txt"

"""
||||
|
||||
# Copyright (C) 2013, Olof Johansson <olof.johansson@axis.com>
|
||||
#
|
||||
# Based in part on bb.fetch2.wget:
|
||||
# Copyright (C) 2003, 2004 Chris Larson
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
#
|
||||
# Based on functions from the base bb module, Copyright 2003 Holger Schurig
|
||||
|
||||
import os
|
||||
import bb
|
||||
import urllib
|
||||
import commands
|
||||
from bb import data
|
||||
from bb.fetch2 import URI
|
||||
from bb.fetch2 import FetchMethod
|
||||
from bb.fetch2 import runfetchcmd
|
||||
|
||||
|
||||
class SFTP(FetchMethod):
|
||||
"""Class to fetch urls via 'sftp'"""
|
||||
|
||||
def supports(self, url, ud, d):
|
||||
"""
|
||||
Check to see if a given url can be fetched with sftp.
|
||||
"""
|
||||
return ud.type in ['sftp']
|
||||
|
||||
def recommends_checksum(self, urldata):
|
||||
return True
|
||||
|
||||
def urldata_init(self, ud, d):
|
||||
if 'protocol' in ud.parm and ud.parm['protocol'] == 'git':
|
||||
raise bb.fetch2.ParameterError(
|
||||
"Invalid protocol - if you wish to fetch from a " +
|
||||
"git repository using ssh, you need to use the " +
|
||||
"git:// prefix with protocol=ssh", ud.url)
|
||||
|
||||
if 'downloadfilename' in ud.parm:
|
||||
ud.basename = ud.parm['downloadfilename']
|
||||
else:
|
||||
ud.basename = os.path.basename(ud.path)
|
||||
|
||||
ud.localfile = data.expand(urllib.unquote(ud.basename), d)
|
||||
|
||||
def download(self, uri, ud, d):
|
||||
"""Fetch urls"""
|
||||
|
||||
urlo = URI(uri)
|
||||
basecmd = 'sftp -oPasswordAuthentication=no'
|
||||
port = ''
|
||||
if urlo.port:
|
||||
port = '-P %d' % urlo.port
|
||||
urlo.port = None
|
||||
|
||||
dldir = data.getVar('DL_DIR', d, True)
|
||||
lpath = os.path.join(dldir, ud.localfile)
|
||||
|
||||
user = ''
|
||||
if urlo.userinfo:
|
||||
user = urlo.userinfo + '@'
|
||||
|
||||
path = urlo.path
|
||||
|
||||
# Supoprt URIs relative to the user's home directory, with
|
||||
# the tilde syntax. (E.g. <sftp://example.com/~/foo.diff>).
|
||||
if path[:3] == '/~/':
|
||||
path = path[3:]
|
||||
|
||||
remote = '%s%s:%s' % (user, urlo.hostname, path)
|
||||
|
||||
cmd = '%s %s %s %s' % (basecmd, port, commands.mkarg(remote),
|
||||
commands.mkarg(lpath))
|
||||
|
||||
bb.fetch2.check_network_access(d, cmd, uri)
|
||||
runfetchcmd(cmd, d)
|
||||
return True
|
||||
@@ -10,12 +10,6 @@ IETF secsh internet draft:
|
||||
Currently does not support the sftp parameters, as this uses scp
|
||||
Also does not support the 'fingerprint' connection parameter.
|
||||
|
||||
Please note that '/' is used as host, path separator not ':' as you may
|
||||
be used to, also '~' can be used to specify user HOME, but again after '/'
|
||||
|
||||
Example SRC_URI:
|
||||
SRC_URI = "ssh://user@host.example.com/dir/path/file.txt"
|
||||
SRC_URI = "ssh://user@host.example.com/~/file.txt"
|
||||
'''
|
||||
|
||||
# Copyright (C) 2006 OpenedHand Ltd.
|
||||
@@ -78,19 +72,15 @@ class SSH(FetchMethod):
|
||||
def supports_checksum(self, urldata):
|
||||
return False
|
||||
|
||||
def urldata_init(self, urldata, d):
|
||||
if 'protocol' in urldata.parm and urldata.parm['protocol'] == 'git':
|
||||
raise bb.fetch2.ParameterError(
|
||||
"Invalid protocol - if you wish to fetch from a git " +
|
||||
"repository using ssh, you need to use " +
|
||||
"git:// prefix with protocol=ssh", urldata.url)
|
||||
def localpath(self, url, urldata, d):
|
||||
m = __pattern__.match(urldata.url)
|
||||
path = m.group('path')
|
||||
host = m.group('host')
|
||||
urldata.localpath = os.path.join(d.getVar('DL_DIR', True), os.path.basename(path))
|
||||
lpath = os.path.join(data.getVar('DL_DIR', d, True), host, os.path.basename(path))
|
||||
return lpath
|
||||
|
||||
def download(self, url, urldata, d):
|
||||
dldir = d.getVar('DL_DIR', True)
|
||||
dldir = data.getVar('DL_DIR', d, True)
|
||||
|
||||
m = __pattern__.match(url)
|
||||
path = m.group('path')
|
||||
@@ -99,10 +89,16 @@ class SSH(FetchMethod):
|
||||
user = m.group('user')
|
||||
password = m.group('pass')
|
||||
|
||||
ldir = os.path.join(dldir, host)
|
||||
lpath = os.path.join(ldir, os.path.basename(path))
|
||||
|
||||
if not os.path.exists(ldir):
|
||||
os.makedirs(ldir)
|
||||
|
||||
if port:
|
||||
portarg = '-P %s' % port
|
||||
port = '-P %s' % port
|
||||
else:
|
||||
portarg = ''
|
||||
port = ''
|
||||
|
||||
if user:
|
||||
fr = user
|
||||
@@ -116,9 +112,9 @@ class SSH(FetchMethod):
|
||||
|
||||
import commands
|
||||
cmd = 'scp -B -r %s %s %s/' % (
|
||||
portarg,
|
||||
port,
|
||||
commands.mkarg(fr),
|
||||
commands.mkarg(dldir)
|
||||
commands.mkarg(ldir)
|
||||
)
|
||||
|
||||
bb.fetch2.check_network_access(d, cmd, urldata.url)
|
||||
|
||||
@@ -27,7 +27,6 @@ import os
import sys
import logging
import bb
import re
from bb import data
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
@@ -90,8 +89,6 @@ class Svn(FetchMethod):

if command == "info":
svncmd = "%s info %s %s://%s/%s/" % (ud.basecmd, " ".join(options), proto, svnroot, ud.module)
elif command == "log1":
svncmd = "%s log --limit 1 %s %s://%s/%s/" % (ud.basecmd, " ".join(options), proto, svnroot, ud.module)
else:
suffix = ""
if ud.revision:
@@ -168,13 +165,14 @@ class Svn(FetchMethod):
"""
Return the latest upstream revision number
"""
bb.fetch2.check_network_access(d, self._buildsvncommand(ud, d, "log1"))
bb.fetch2.check_network_access(d, self._buildsvncommand(ud, d, "info"))

output = runfetchcmd("LANG=C LC_ALL=C " + self._buildsvncommand(ud, d, "log1"), d, True)
output = runfetchcmd("LANG=C LC_ALL=C " + self._buildsvncommand(ud, d, "info"), d, True)

# skip the first line, as per output of svn log
# then we expect the revision on the 2nd line
revision = re.search('^r([0-9]*)', output.splitlines()[1]).group(1)
revision = None
for line in output.splitlines():
if "Last Changed Rev" in line:
revision = line.split(":")[1].strip()

return revision

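Aside (illustration only, with a made-up revision number): the replacement latest-revision logic above scans "svn info" output for the "Last Changed Rev" field instead of parsing "svn log". A minimal sketch of that parsing:

line = "Last Changed Rev: 12345"
if "Last Changed Rev" in line:
    revision = line.split(":")[1].strip()   # "12345"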
@@ -32,6 +32,8 @@ import urllib
from bb import data
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import encodeurl
from bb.fetch2 import decodeurl
from bb.fetch2 import logger
from bb.fetch2 import runfetchcmd

@@ -63,20 +65,21 @@ class Wget(FetchMethod):

basecmd = d.getVar("FETCHCMD_wget", True) or "/usr/bin/env wget -t 2 -T 30 -nv --passive-ftp --no-check-certificate"

if not checkonly and 'downloadfilename' in ud.parm:
dldir = d.getVar("DL_DIR", True)
bb.utils.mkdirhier(os.path.dirname(dldir + os.sep + ud.localfile))
basecmd += " -O " + dldir + os.sep + ud.localfile
if 'downloadfilename' in ud.parm:
basecmd += " -O ${DL_DIR}/" + ud.localfile

if checkonly:
fetchcmd = d.getVar("CHECKCOMMAND_wget", True) or d.expand(basecmd + " --spider '${URI}'")
fetchcmd = d.getVar("CHECKCOMMAND_wget", True) or d.expand(basecmd + " -c -P ${DL_DIR} '${URI}'")
elif os.path.exists(ud.localpath):
# file exists, but we didnt complete it.. trying again..
fetchcmd = d.getVar("RESUMECOMMAND_wget", True) or d.expand(basecmd + " -c -P ${DL_DIR} '${URI}'")
fetchcmd = d.getVar("RESUMECOMMAND_wget", True) or d.expand(basecmd + " --spider -P ${DL_DIR} '${URI}'")
else:
fetchcmd = d.getVar("FETCHCOMMAND_wget", True) or d.expand(basecmd + " -P ${DL_DIR} '${URI}'")

uri = uri.split(";")[0]
uri_decoded = list(decodeurl(uri))
uri_type = uri_decoded[0]
uri_host = uri_decoded[1]

fetchcmd = fetchcmd.replace("${URI}", uri.split(";")[0])
fetchcmd = fetchcmd.replace("${FILE}", ud.basename)

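Aside (illustration only; the filename is made up): with a downloadfilename parameter in the SRC_URI, both variants of the wget hunk above end up appending a -O option so the file is written under DL_DIR with the requested name. Roughly, the command is assembled like this:

basecmd = "/usr/bin/env wget -t 2 -T 30 -nv --passive-ftp --no-check-certificate"
basecmd += " -O ${DL_DIR}/foo-1.0.tar.gz"        # from downloadfilename=foo-1.0.tar.gz
fetchcmd = basecmd + " -P ${DL_DIR} '${URI}'"    # default FETCHCOMMAND form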
@@ -17,7 +17,24 @@
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.


"""
What is a method pool?

BitBake has a global method scope where .bb, .inc and .bbclass
files can install methods. These methods are parsed from strings.
To avoid recompiling and executing these string we introduce
a method pool to do this task.

This pool will be used to compile and execute the functions. It
will be smart enough to
"""

from bb.utils import better_compile, better_exec
from bb import error

# A dict of function names we have seen
_parsed_fns = { }

def insert_method(modulename, code, fn):
"""
@@ -26,3 +43,29 @@ def insert_method(modulename, code, fn):
"""
comp = better_compile(code, modulename, fn )
better_exec(comp, None, code, fn)

# now some instrumentation
code = comp.co_names
for name in code:
if name in ['None', 'False']:
continue
elif name in _parsed_fns and not _parsed_fns[name] == modulename:
error("The function %s defined in %s was already declared in %s. BitBake has a global python function namespace so shared functions should be declared in a common include file rather than being duplicated, or if the functions are different, please use different function names." % (name, modulename, _parsed_fns[name]))
else:
_parsed_fns[name] = modulename

# A dict of modules the parser has finished with
_parsed_methods = {}

def parsed_module(modulename):
"""
Has module been parsed?
"""
return modulename in _parsed_methods

def set_parsed_module(modulename):
"""
Set module as parsed
"""
_parsed_methods[modulename] = True

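Aside (a minimal usage sketch, not part of the commits; the module name and function body are made up): the methodpool helpers above are meant to be driven by the parser, roughly like this:

import bb.methodpool

code = "def do_report(d):\n    print('hello from a parsed method')\n"

if not bb.methodpool.parsed_module("example_bbclass"):
    # Compile and execute the string once, registering do_report in the
    # global method scope, then mark the module as parsed.
    bb.methodpool.insert_method("example_bbclass", code, "example.bbclass")
    bb.methodpool.set_parsed_module("example_bbclass")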
@@ -107,7 +107,7 @@ def getDiskData(BBDirs, configuration):
printErr("Invalid disk space value in BB_DISKMON_DIRS: %s" % pathSpaceInodeRe.group(3))
return None
else:
# None means that it is not specified
# 0 means that it is not specified
minSpace = None

minInode = pathSpaceInodeRe.group(4)
@@ -117,7 +117,7 @@ def getDiskData(BBDirs, configuration):
printErr("Invalid inode value in BB_DISKMON_DIRS: %s" % pathSpaceInodeRe.group(4))
return None
else:
# None means that it is not specified
# 0 means that it is not specified
minInode = None

if minSpace is None and minInode is None:
@@ -127,9 +127,8 @@ def getDiskData(BBDirs, configuration):
# DL_DIR may not exist at the very beginning
if not os.path.exists(path):
bb.utils.mkdirhier(path)
dev = getMountedDev(path)
# Use path/action as the key
devDict[os.path.join(path, action)] = [dev, minSpace, minInode]
mountedDev = getMountedDev(path)
devDict[mountedDev] = action, path, minSpace, minInode

return devDict

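Aside (illustration only; the variable syntax is summarised from the BitBake documentation as I recall it, and the paths are examples): BB_DISKMON_DIRS entries fed to getDiskData take the form "<action>,<dir>,<min free space>,<min free inodes>", e.g. STOPTASKS,${TMPDIR},1G,100K. With the keying change above, the resulting devDict is indexed by mounted device rather than by path/action, roughly:

# devDict[mountedDev] = action, path, minSpace, minInode
devDict = {
    "/dev/sda1": ("STOPTASKS", "/srv/build/tmp", 1024 ** 3, 100 * 1024),
}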
@@ -193,10 +192,10 @@ class diskMonitor:
# This is for STOPTASKS and ABORT, to avoid printing the message repeatedly
# while waiting for the tasks to finish
self.checked = {}
for k in self.devDict:
self.preFreeS[k] = 0
self.preFreeI[k] = 0
self.checked[k] = False
for dev in self.devDict:
self.preFreeS[dev] = 0
self.preFreeI[dev] = 0
self.checked[dev] = False
if self.spaceInterval is None and self.inodeInterval is None:
self.enableMonitor = False

@@ -205,61 +204,46 @@ class diskMonitor:
|
||||
""" Take action for the monitor """
|
||||
|
||||
if self.enableMonitor:
|
||||
for k in self.devDict:
|
||||
path = os.path.dirname(k)
|
||||
action = os.path.basename(k)
|
||||
dev = self.devDict[k][0]
|
||||
minSpace = self.devDict[k][1]
|
||||
minInode = self.devDict[k][2]
|
||||
|
||||
st = os.statvfs(path)
|
||||
for dev in self.devDict:
|
||||
st = os.statvfs(self.devDict[dev][1])
|
||||
|
||||
# The free space, float point number
|
||||
freeSpace = st.f_bavail * st.f_frsize
|
||||
|
||||
if minSpace and freeSpace < minSpace:
|
||||
if self.devDict[dev][2] and freeSpace < self.devDict[dev][2]:
|
||||
# Always show warning, the self.checked would always be False if the action is WARN
|
||||
if self.preFreeS[k] == 0 or self.preFreeS[k] - freeSpace > self.spaceInterval and not self.checked[k]:
|
||||
logger.warn("The free space of %s (%s) is running low (%.3fGB left)" % \
|
||||
(path, dev, freeSpace / 1024 / 1024 / 1024.0))
|
||||
self.preFreeS[k] = freeSpace
|
||||
if self.preFreeS[dev] == 0 or self.preFreeS[dev] - freeSpace > self.spaceInterval and not self.checked[dev]:
|
||||
logger.warn("The free space of %s is running low (%.3fGB left)" % (dev, freeSpace / 1024 / 1024 / 1024.0))
|
||||
self.preFreeS[dev] = freeSpace
|
||||
|
||||
if action == "STOPTASKS" and not self.checked[k]:
|
||||
if self.devDict[dev][0] == "STOPTASKS" and not self.checked[dev]:
|
||||
logger.error("No new tasks can be excuted since the disk space monitor action is \"STOPTASKS\"!")
|
||||
self.checked[k] = True
|
||||
self.checked[dev] = True
|
||||
rq.finish_runqueue(False)
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'disk', freeSpace, path), self.configuration)
|
||||
elif action == "ABORT" and not self.checked[k]:
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'disk', freeSpace, self.devDict[dev][1]), self.configuration)
|
||||
elif self.devDict[dev][0] == "ABORT" and not self.checked[dev]:
|
||||
logger.error("Immediately abort since the disk space monitor action is \"ABORT\"!")
|
||||
self.checked[k] = True
|
||||
self.checked[dev] = True
|
||||
rq.finish_runqueue(True)
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'disk', freeSpace, path), self.configuration)
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'disk', freeSpace, self.devDict[dev][1]), self.configuration)
|
||||
|
||||
# The free inodes, float point number
|
||||
freeInode = st.f_favail
|
||||
|
||||
if minInode and freeInode < minInode:
|
||||
# Some fs formats' (e.g., btrfs) statvfs.f_files (inodes) is
|
||||
# zero, this is a feature of the fs, we disable the inode
|
||||
# checking for such a fs.
|
||||
if st.f_files == 0:
|
||||
logger.warn("Inode check for %s is unavaliable, will remove it from disk monitor" % path)
|
||||
self.devDict[k][2] = None
|
||||
continue
|
||||
if self.devDict[dev][3] and freeInode < self.devDict[dev][3]:
|
||||
# Always show warning, the self.checked would always be False if the action is WARN
|
||||
if self.preFreeI[k] == 0 or self.preFreeI[k] - freeInode > self.inodeInterval and not self.checked[k]:
|
||||
logger.warn("The free inode of %s (%s) is running low (%.3fK left)" % \
|
||||
(path, dev, freeInode / 1024.0))
|
||||
self.preFreeI[k] = freeInode
|
||||
if self.preFreeI[dev] == 0 or self.preFreeI[dev] - freeInode > self.inodeInterval and not self.checked[dev]:
|
||||
logger.warn("The free inode of %s is running low (%.3fK left)" % (dev, freeInode / 1024.0))
|
||||
self.preFreeI[dev] = freeInode
|
||||
|
||||
if action == "STOPTASKS" and not self.checked[k]:
|
||||
if self.devDict[dev][0] == "STOPTASKS" and not self.checked[dev]:
|
||||
logger.error("No new tasks can be excuted since the disk space monitor action is \"STOPTASKS\"!")
|
||||
self.checked[k] = True
|
||||
self.checked[dev] = True
|
||||
rq.finish_runqueue(False)
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'inode', freeInode, path), self.configuration)
|
||||
elif action == "ABORT" and not self.checked[k]:
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'inode', freeSpace, self.devDict[dev][1]), self.configuration)
|
||||
elif self.devDict[dev][0] == "ABORT" and not self.checked[dev]:
|
||||
logger.error("Immediately abort since the disk space monitor action is \"ABORT\"!")
|
||||
self.checked[k] = True
|
||||
self.checked[dev] = True
|
||||
rq.finish_runqueue(True)
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'inode', freeInode, path), self.configuration)
|
||||
bb.event.fire(bb.event.DiskFull(dev, 'inode', freeSpace, self.devDict[dev][1]), self.configuration)
|
||||
return
|
||||
|
||||
@@ -23,7 +23,6 @@ Message handling infrastructure for bitbake
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import sys
|
||||
import copy
|
||||
import logging
|
||||
import collections
|
||||
from itertools import groupby
|
||||
@@ -56,25 +55,6 @@ class BBLogFormatter(logging.Formatter):
|
||||
CRITICAL: 'ERROR',
|
||||
}
|
||||
|
||||
color_enabled = False
|
||||
BASECOLOR, BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(29,38)
|
||||
|
||||
COLORS = {
|
||||
DEBUG3 : CYAN,
|
||||
DEBUG2 : CYAN,
|
||||
DEBUG : CYAN,
|
||||
VERBOSE : BASECOLOR,
|
||||
NOTE : BASECOLOR,
|
||||
PLAIN : BASECOLOR,
|
||||
WARNING : YELLOW,
|
||||
ERROR : RED,
|
||||
CRITICAL: RED,
|
||||
}
|
||||
|
||||
BLD = '\033[1;%dm'
|
||||
STD = '\033[%dm'
|
||||
RST = '\033[0m'
|
||||
|
||||
def getLevelName(self, levelno):
|
||||
try:
|
||||
return self.levelnames[levelno]
|
||||
@@ -87,8 +67,6 @@ class BBLogFormatter(logging.Formatter):
|
||||
if record.levelno == self.PLAIN:
|
||||
msg = record.getMessage()
|
||||
else:
|
||||
if self.color_enabled:
|
||||
record = self.colorize(record)
|
||||
msg = logging.Formatter.format(self, record)
|
||||
|
||||
if hasattr(record, 'bb_exc_info'):
|
||||
@@ -97,17 +75,6 @@ class BBLogFormatter(logging.Formatter):
|
||||
msg += '\n' + ''.join(formatted)
|
||||
return msg
|
||||
|
||||
def colorize(self, record):
|
||||
color = self.COLORS[record.levelno]
|
||||
if self.color_enabled and color is not None:
|
||||
record = copy.copy(record)
|
||||
record.levelname = "".join([self.BLD % color, record.levelname, self.RST])
|
||||
record.msg = "".join([self.STD % color, record.msg, self.RST])
|
||||
return record
|
||||
|
||||
def enable_color(self):
|
||||
self.color_enabled = True
|
||||
|
||||
class BBLogFilter(object):
|
||||
def __init__(self, handler, level, debug_domains):
|
||||
self.stdlevel = level
|
||||
|
||||
@@ -73,7 +73,8 @@ def update_mtime(f):
|
||||
def mark_dependency(d, f):
|
||||
if f.startswith('./'):
|
||||
f = "%s/%s" % (os.getcwd(), f[2:])
|
||||
deps = (d.getVar('__depends') or []) + [(f, cached_mtime(f))]
|
||||
deps = d.getVar('__depends') or set()
|
||||
deps.update([(f, cached_mtime(f))])
|
||||
d.setVar('__depends', deps)
|
||||
|
||||
def supports(fn, data):
|
||||
@@ -87,8 +88,7 @@ def handle(fn, data, include = 0):
|
||||
"""Call the handler that is appropriate for this file"""
|
||||
for h in handlers:
|
||||
if h['supports'](fn, data):
|
||||
with data.inchistory.include(fn):
|
||||
return h['handle'](fn, data, include)
|
||||
return h['handle'](fn, data, include)
|
||||
raise ParseError("not a BitBake file", fn)
|
||||
|
||||
def init(fn, data):
|
||||
@@ -134,8 +134,8 @@ def vars_from_file(mypkg, d):
|
||||
def get_file_depends(d):
|
||||
'''Return the dependent files'''
|
||||
dep_files = []
|
||||
depends = d.getVar('__base_depends', True) or []
|
||||
depends = depends + (d.getVar('__depends', True) or [])
|
||||
depends = d.getVar('__depends', True) or set()
|
||||
depends = depends.union(d.getVar('__base_depends', True) or set())
|
||||
for (fn, _) in depends:
|
||||
dep_files.append(os.path.abspath(fn))
|
||||
return " ".join(dep_files)
|
||||
|
||||
@@ -68,7 +68,7 @@ class ExportNode(AstNode):
|
||||
self.var = var
|
||||
|
||||
def eval(self, data):
|
||||
data.setVarFlag(self.var, "export", 1, op = 'exported')
|
||||
data.setVarFlag(self.var, "export", 1)
|
||||
|
||||
class DataNode(AstNode):
|
||||
"""
|
||||
@@ -90,53 +90,33 @@ class DataNode(AstNode):
|
||||
def eval(self, data):
|
||||
groupd = self.groupd
|
||||
key = groupd["var"]
|
||||
loginfo = {
|
||||
'variable': key,
|
||||
'file': self.filename,
|
||||
'line': self.lineno,
|
||||
}
|
||||
if "exp" in groupd and groupd["exp"] != None:
|
||||
data.setVarFlag(key, "export", 1, op = 'exported', **loginfo)
|
||||
|
||||
op = "set"
|
||||
data.setVarFlag(key, "export", 1)
|
||||
if "ques" in groupd and groupd["ques"] != None:
|
||||
val = self.getFunc(key, data)
|
||||
op = "set?"
|
||||
if val == None:
|
||||
val = groupd["value"]
|
||||
elif "colon" in groupd and groupd["colon"] != None:
|
||||
e = data.createCopy()
|
||||
bb.data.update_data(e)
|
||||
op = "immediate"
|
||||
val = e.expand(groupd["value"], key + "[:=]")
|
||||
elif "append" in groupd and groupd["append"] != None:
|
||||
op = "append"
|
||||
val = "%s %s" % ((self.getFunc(key, data) or ""), groupd["value"])
|
||||
elif "prepend" in groupd and groupd["prepend"] != None:
|
||||
op = "prepend"
|
||||
val = "%s %s" % (groupd["value"], (self.getFunc(key, data) or ""))
|
||||
elif "postdot" in groupd and groupd["postdot"] != None:
|
||||
op = "postdot"
|
||||
val = "%s%s" % ((self.getFunc(key, data) or ""), groupd["value"])
|
||||
elif "predot" in groupd and groupd["predot"] != None:
|
||||
op = "predot"
|
||||
val = "%s%s" % (groupd["value"], (self.getFunc(key, data) or ""))
|
||||
else:
|
||||
val = groupd["value"]
|
||||
|
||||
flag = None
|
||||
if 'flag' in groupd and groupd['flag'] != None:
|
||||
flag = groupd['flag']
|
||||
data.setVarFlag(key, groupd['flag'], val)
|
||||
elif groupd["lazyques"]:
|
||||
flag = "defaultval"
|
||||
|
||||
loginfo['op'] = op
|
||||
loginfo['detail'] = groupd["value"]
|
||||
|
||||
if flag:
|
||||
data.setVarFlag(key, flag, val, **loginfo)
|
||||
data.setVarFlag(key, "defaultval", val)
|
||||
else:
|
||||
data.setVar(key, val, **loginfo)
|
||||
data.setVar(key, val)
|
||||
|
||||
class MethodNode(AstNode):
|
||||
def __init__(self, filename, lineno, func_name, body):
|
||||
@@ -148,8 +128,9 @@ class MethodNode(AstNode):
|
||||
text = '\n'.join(self.body)
|
||||
if self.func_name == "__anonymous":
|
||||
funcname = ("__anon_%s_%s" % (self.lineno, self.filename.translate(string.maketrans('/.+-', '____'))))
|
||||
text = "def %s(d):\n" % (funcname) + text
|
||||
bb.methodpool.insert_method(funcname, text, self.filename)
|
||||
if not funcname in bb.methodpool._parsed_fns:
|
||||
text = "def %s(d):\n" % (funcname) + text
|
||||
bb.methodpool.insert_method(funcname, text, self.filename)
|
||||
anonfuncs = data.getVar('__BBANONFUNCS') or []
|
||||
anonfuncs.append(funcname)
|
||||
data.setVar('__BBANONFUNCS', anonfuncs)
|
||||
@@ -170,7 +151,8 @@ class PythonMethodNode(AstNode):
|
||||
# 'this' file. This means we will not parse methods from
|
||||
# bb classes twice
|
||||
text = '\n'.join(self.body)
|
||||
bb.methodpool.insert_method(self.modulename, text, self.filename)
|
||||
if not bb.methodpool.parsed_module(self.modulename):
|
||||
bb.methodpool.insert_method(self.modulename, text, self.filename)
|
||||
data.setVarFlag(self.function, "func", 1)
|
||||
data.setVarFlag(self.function, "python", 1)
|
||||
data.setVar(self.function, text)
|
||||
@@ -197,35 +179,44 @@ class MethodFlagsNode(AstNode):
|
||||
data.delVarFlag(self.key, "fakeroot")
|
||||
|
||||
class ExportFuncsNode(AstNode):
|
||||
def __init__(self, filename, lineno, fns, classname):
|
||||
def __init__(self, filename, lineno, fns, classes):
|
||||
AstNode.__init__(self, filename, lineno)
|
||||
self.n = fns.split()
|
||||
self.classname = classname
|
||||
self.classes = classes
|
||||
|
||||
def eval(self, data):
|
||||
for f in self.n:
|
||||
allvars = []
|
||||
allvars.append(f)
|
||||
allvars.append(self.classes[-1] + "_" + f)
|
||||
|
||||
for func in self.n:
|
||||
calledfunc = self.classname + "_" + func
|
||||
vars = [[ allvars[0], allvars[1] ]]
|
||||
if len(self.classes) > 1 and self.classes[-2] is not None:
|
||||
allvars.append(self.classes[-2] + "_" + f)
|
||||
vars = []
|
||||
vars.append([allvars[2], allvars[1]])
|
||||
vars.append([allvars[0], allvars[2]])
|
||||
|
||||
if data.getVar(func) and not data.getVarFlag(func, 'export_func'):
|
||||
continue
|
||||
for (var, calledvar) in vars:
|
||||
if data.getVar(var) and not data.getVarFlag(var, 'export_func'):
|
||||
continue
|
||||
|
||||
if data.getVar(func):
|
||||
data.setVarFlag(func, 'python', None)
|
||||
data.setVarFlag(func, 'func', None)
|
||||
if data.getVar(var):
|
||||
data.setVarFlag(var, 'python', None)
|
||||
data.setVarFlag(var, 'func', None)
|
||||
|
||||
for flag in [ "func", "python" ]:
|
||||
if data.getVarFlag(calledfunc, flag):
|
||||
data.setVarFlag(func, flag, data.getVarFlag(calledfunc, flag))
|
||||
for flag in [ "dirs" ]:
|
||||
if data.getVarFlag(func, flag):
|
||||
data.setVarFlag(calledfunc, flag, data.getVarFlag(func, flag))
|
||||
for flag in [ "func", "python" ]:
|
||||
if data.getVarFlag(calledvar, flag):
|
||||
data.setVarFlag(var, flag, data.getVarFlag(calledvar, flag))
|
||||
for flag in [ "dirs" ]:
|
||||
if data.getVarFlag(var, flag):
|
||||
data.setVarFlag(calledvar, flag, data.getVarFlag(var, flag))
|
||||
|
||||
if data.getVarFlag(calledfunc, "python"):
|
||||
data.setVar(func, " bb.build.exec_func('" + calledfunc + "', d)\n")
|
||||
else:
|
||||
data.setVar(func, " " + calledfunc + "\n")
|
||||
data.setVarFlag(func, 'export_func', '1')
|
||||
if data.getVarFlag(calledvar, "python"):
|
||||
data.setVar(var, " bb.build.exec_func('" + calledvar + "', d)\n")
|
||||
else:
|
||||
data.setVar(var, " " + calledvar + "\n")
|
||||
data.setVarFlag(var, 'export_func', '1')
|
||||
|
||||
class AddTaskNode(AstNode):
|
||||
def __init__(self, filename, lineno, func, before, after):
|
||||
@@ -297,8 +288,8 @@ def handlePythonMethod(statements, filename, lineno, funcname, modulename, body)
|
||||
def handleMethodFlags(statements, filename, lineno, key, m):
|
||||
statements.append(MethodFlagsNode(filename, lineno, key, m))
|
||||
|
||||
def handleExportFuncs(statements, filename, lineno, m, classname):
|
||||
statements.append(ExportFuncsNode(filename, lineno, m.group(1), classname))
|
||||
def handleExportFuncs(statements, filename, lineno, m, classes):
|
||||
statements.append(ExportFuncsNode(filename, lineno, m.group(1), classes))
|
||||
|
||||
def handleAddTask(statements, filename, lineno, m):
|
||||
func = m.group("func")
|
||||
|
||||
@@ -51,6 +51,7 @@ __infunc__ = ""
|
||||
__inpython__ = False
|
||||
__body__ = []
|
||||
__classname__ = ""
|
||||
classes = [ None, ]
|
||||
|
||||
cached_statements = {}
|
||||
|
||||
@@ -74,17 +75,10 @@ def inherit(files, fn, lineno, d):
|
||||
if not os.path.isabs(file) and not file.endswith(".bbclass"):
|
||||
file = os.path.join('classes', '%s.bbclass' % file)
|
||||
|
||||
if not os.path.isabs(file):
|
||||
dname = os.path.dirname(fn)
|
||||
bbpath = "%s:%s" % (dname, d.getVar("BBPATH", True))
|
||||
abs_fn = bb.utils.which(bbpath, file)
|
||||
if abs_fn:
|
||||
file = abs_fn
|
||||
|
||||
if not file in __inherit_cache:
|
||||
logger.log(logging.DEBUG -1, "BB %s:%d: inheriting %s", fn, lineno, file)
|
||||
__inherit_cache.append( file )
|
||||
d.setVar('__inherit_cache', __inherit_cache)
|
||||
data.setVar('__inherit_cache', __inherit_cache, d)
|
||||
include(fn, file, lineno, d, "inherit")
|
||||
__inherit_cache = d.getVar('__inherit_cache') or []
|
||||
|
||||
@@ -113,7 +107,7 @@ def get_statements(filename, absolute_filename, base_name):
|
||||
return statements
|
||||
|
||||
def handle(fn, d, include):
|
||||
global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __infunc__, __body__, __residue__, __classname__
|
||||
global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __infunc__, __body__, __residue__
|
||||
__body__ = []
|
||||
__infunc__ = ""
|
||||
__classname__ = ""
|
||||
@@ -131,10 +125,11 @@ def handle(fn, d, include):
|
||||
|
||||
if ext == ".bbclass":
|
||||
__classname__ = root
|
||||
classes.append(__classname__)
|
||||
__inherit_cache = d.getVar('__inherit_cache') or []
|
||||
if not fn in __inherit_cache:
|
||||
__inherit_cache.append(fn)
|
||||
d.setVar('__inherit_cache', __inherit_cache)
|
||||
data.setVar('__inherit_cache', __inherit_cache, d)
|
||||
|
||||
if include != 0:
|
||||
oldfile = d.getVar('FILE')
|
||||
@@ -151,25 +146,27 @@ def handle(fn, d, include):
|
||||
|
||||
# DONE WITH PARSING... time to evaluate
|
||||
if ext != ".bbclass":
|
||||
d.setVar('FILE', abs_fn)
|
||||
data.setVar('FILE', abs_fn, d)
|
||||
|
||||
try:
|
||||
statements.eval(d)
|
||||
except bb.parse.SkipPackage:
|
||||
bb.data.setVar("__SKIPPED", True, d)
|
||||
statements.eval(d)
|
||||
|
||||
if ext == ".bbclass":
|
||||
classes.remove(__classname__)
|
||||
else:
|
||||
if include == 0:
|
||||
return { "" : d }
|
||||
|
||||
if ext != ".bbclass" and include == 0:
|
||||
return ast.multi_finalize(fn, d)
|
||||
return ast.multi_finalize(fn, d)
|
||||
|
||||
if oldfile:
|
||||
d.setVar("FILE", oldfile)
|
||||
|
||||
# we have parsed the bb class now
|
||||
if ext == ".bbclass" or ext == ".inc":
|
||||
bb.methodpool.set_parsed_module(base_name)
|
||||
|
||||
return d
|
||||
|
||||
def feeder(lineno, s, fn, root, statements):
|
||||
global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __def_regexp__, __python_func_regexp__, __inpython__, __infunc__, __body__, bb, __residue__, __classname__
|
||||
global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __def_regexp__, __python_func_regexp__, __inpython__, __infunc__, __body__, classes, bb, __residue__
|
||||
if __infunc__:
|
||||
if s == '}':
|
||||
__body__.append('')
|
||||
@@ -196,10 +193,7 @@ def feeder(lineno, s, fn, root, statements):
|
||||
|
||||
if s and s[0] == '#':
|
||||
if len(__residue__) != 0 and __residue__[0][0] != "#":
|
||||
bb.fatal("There is a comment on line %s of file %s (%s) which is in the middle of a multiline expression.\nBitbake used to ignore these but no longer does so, please fix your metadata as errors are likely as a result of this change." % (lineno, fn, s))
|
||||
|
||||
if len(__residue__) != 0 and __residue__[0][0] == "#" and (not s or s[0] != "#"):
|
||||
bb.fatal("There is a confusing multiline, partially commented expression on line %s of file %s (%s).\nPlease clarify whether this is all a comment or should be parsed." % (lineno, fn, s))
|
||||
bb.error("There is a comment on line %s of file %s (%s) which is in the middle of a multiline expression.\nBitbake used to ignore these but no longer does so, please fix your metadata as errors are likely as a result of this change." % (lineno, fn, s))
|
||||
|
||||
if s and s[-1] == '\\':
|
||||
__residue__.append(s[:-1])
|
||||
@@ -231,7 +225,7 @@ def feeder(lineno, s, fn, root, statements):
|
||||
|
||||
m = __export_func_regexp__.match(s)
|
||||
if m:
|
||||
ast.handleExportFuncs(statements, fn, lineno, m, __classname__)
|
||||
ast.handleExportFuncs(statements, fn, lineno, m, classes)
|
||||
return
|
||||
|
||||
m = __addtask_regexp__.match(s)
|
||||
|
||||
@@ -29,30 +29,7 @@ import logging
import bb.utils
from bb.parse import ParseError, resolve_file, ast, logger

__config_regexp__ = re.compile( r"""
^
(?P<exp>export\s*)?
(?P<var>[a-zA-Z0-9\-~_+.${}/]+?)
(\[(?P<flag>[a-zA-Z0-9\-_+.]+)\])?

\s* (
(?P<colon>:=) |
(?P<lazyques>\?\?=) |
(?P<ques>\?=) |
(?P<append>\+=) |
(?P<prepend>=\+) |
(?P<predot>=\.) |
(?P<postdot>\.=) |
=
) \s*

(?!'[^']*'[^']*'$)
(?!\"[^\"]*\"[^\"]*\"$)
(?P<apo>['\"])
(?P<value>.*)
(?P=apo)
$
""", re.X)
__config_regexp__ = re.compile( r"(?P<exp>export\s*)?(?P<var>[a-zA-Z0-9\-_+.${}/]+)(\[(?P<flag>[a-zA-Z0-9\-_+.]+)\])?\s*((?P<colon>:=)|(?P<lazyques>\?\?=)|(?P<ques>\?=)|(?P<append>\+=)|(?P<prepend>=\+)|(?P<predot>=\.)|(?P<postdot>\.=)|=)\s*(?!'[^']*'[^']*'$)(?!\"[^\"]*\"[^\"]*\"$)(?P<apo>['\"])(?P<value>.*)(?P=apo)$")
__include_regexp__ = re.compile( r"include\s+(.+)" )
__require_regexp__ = re.compile( r"require\s+(.+)" )
__export_regexp__ = re.compile( r"export\s+([a-zA-Z0-9\-_+.${}/]+)$" )
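Aside (illustration only; the variable names are made up): the named groups in __config_regexp__ correspond to the usual conf-file assignment operators, so lines like the following are what the regular expression is built to match:

samples = [
    'FOO = "bar"',               # plain assignment
    'FOO ?= "default"',          # ques: default if unset
    'FOO ??= "weak"',            # lazyques: weak default
    'FOO := "now"',              # colon: immediate expansion
    'FOO += "more"',             # append
    'FOO =+ "front"',            # prepend
    'FOO[flag] = "1"',           # flag assignment
    'export PATH = "/usr/bin"',  # exp group
]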
@@ -121,22 +98,15 @@ def handle(fn, data, include):
while True:
lineno = lineno + 1
s = f.readline()
if not s:
break
if not s: break
w = s.strip()
# skip empty lines
if not w:
continue
if not w: continue # skip empty lines
s = s.rstrip()
if s[0] == '#': continue # skip comments
while s[-1] == '\\':
s2 = f.readline().strip()
lineno = lineno + 1
if (not s2 or s2 and s2[0] != "#") and s[0] == "#" :
bb.fatal("There is a confusing multiline, partially commented expression on line %s of file %s (%s).\nPlease clarify whether this is all a comment or should be parsed." % (lineno, fn, s))
s = s[:-1] + s2
# skip comments
if s[0] == '#':
continue
feeder(lineno, s, fn, statements)

# DONE WITH PARSING... time to evaluate

@@ -125,11 +125,6 @@ class SQLTable(collections.MutableMapping):

return len(self) < len(other)

def get_by_pattern(self, pattern):
data = self._execute("SELECT * FROM %s WHERE key LIKE ?;" %
self.table, [pattern])
return [row[1] for row in data]

def values(self):
return list(self.itervalues())

@@ -472,7 +472,7 @@ class RunQueueData:
|
||||
if depdata is not None:
|
||||
taskid = taskData.gettask_id_fromfnid(depdata, idependtask)
|
||||
if taskid is None:
|
||||
bb.msg.fatal("RunQueue", "Task %s in %s depends upon non-existent task %s in %s" % (taskData.tasks_name[task], fn, idependtask, taskData.fn_index[depdata]))
|
||||
bb.msg.fatal("RunQueue", "Task %s in %s depends upon non-existent task %s in %s" % (taskData.tasks_name[task], fn, idependtask, dep))
|
||||
depends.add(taskid)
|
||||
irdepends = taskData.tasks_irdepends[task]
|
||||
for (depid, idependtask) in irdepends:
|
||||
@@ -482,7 +482,7 @@ class RunQueueData:
|
||||
if depdata is not None:
|
||||
taskid = taskData.gettask_id_fromfnid(depdata, idependtask)
|
||||
if taskid is None:
|
||||
bb.msg.fatal("RunQueue", "Task %s in %s rdepends upon non-existent task %s in %s" % (taskData.tasks_name[task], fn, idependtask, taskData.fn_index[depdata]))
|
||||
bb.msg.fatal("RunQueue", "Task %s in %s rdepends upon non-existent task %s in %s" % (taskData.tasks_name[task], fn, idependtask, dep))
|
||||
depends.add(taskid)
|
||||
|
||||
# Resolve recursive 'recrdeptask' dependencies (Part A)
|
||||
@@ -785,7 +785,6 @@ class RunQueue:
|
||||
self.stamppolicy = cfgData.getVar("BB_STAMP_POLICY", True) or "perfile"
|
||||
self.hashvalidate = cfgData.getVar("BB_HASHCHECK_FUNCTION", True) or None
|
||||
self.setsceneverify = cfgData.getVar("BB_SETSCENE_VERIFY_FUNCTION", True) or None
|
||||
self.depvalidate = cfgData.getVar("BB_SETSCENE_DEPVALID", True) or None
|
||||
|
||||
self.state = runQueuePrepare
|
||||
|
||||
@@ -1149,8 +1148,7 @@ class RunQueueExecute:
|
||||
os._exit(1)
|
||||
try:
|
||||
if not self.cooker.configuration.dry_run:
|
||||
profile = self.cooker.configuration.profile
|
||||
ret = bb.build.exec_task(fn, taskname, the_data, profile)
|
||||
ret = bb.build.exec_task(fn, taskname, the_data)
|
||||
os._exit(ret)
|
||||
except:
|
||||
os._exit(1)
|
||||
@@ -1163,26 +1161,6 @@ class RunQueueExecute:
|
||||
|
||||
return pid, pipein, pipeout
|
||||
|
||||
def check_dependencies(self, task, taskdeps, setscene = False):
|
||||
if not self.rq.depvalidate:
|
||||
return False
|
||||
|
||||
taskdata = {}
|
||||
taskdeps.add(task)
|
||||
for dep in taskdeps:
|
||||
if setscene:
|
||||
depid = self.rqdata.runq_setscene[dep]
|
||||
else:
|
||||
depid = dep
|
||||
fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[depid]]
|
||||
pn = self.rqdata.dataCache.pkg_fn[fn]
|
||||
taskname = self.rqdata.runq_task[depid]
|
||||
taskdata[dep] = [pn, taskname, fn]
|
||||
call = self.rq.depvalidate + "(task, taskdata, notneeded, d)"
|
||||
locs = { "task" : task, "taskdata" : taskdata, "notneeded" : self.scenequeue_notneeded, "d" : self.cooker.configuration.data }
|
||||
valid = bb.utils.better_eval(call, locs)
|
||||
return valid
|
||||
|
||||
class RunQueueExecuteDummy(RunQueueExecute):
|
||||
def __init__(self, rq):
|
||||
self.rq = rq
|
||||
@@ -1220,8 +1198,16 @@ class RunQueueExecuteTasks(RunQueueExecute):
|
||||
logger.debug(1, 'Considering %s (%s): %s' % (task, self.rqdata.get_user_idstring(task), str(self.rqdata.runq_revdeps[task])))
|
||||
|
||||
if len(self.rqdata.runq_revdeps[task]) > 0 and self.rqdata.runq_revdeps[task].issubset(self.rq.scenequeue_covered) and task not in self.rq.scenequeue_notcovered:
|
||||
found = True
|
||||
self.rq.scenequeue_covered.add(task)
|
||||
ok = True
|
||||
for revdep in self.rqdata.runq_revdeps[task]:
|
||||
if self.rqdata.runq_fnid[task] != self.rqdata.runq_fnid[revdep]:
|
||||
logger.debug(1, 'Found "bad" dep %s (%s) for %s (%s)' % (revdep, self.rqdata.get_user_idstring(revdep), task, self.rqdata.get_user_idstring(task)))
|
||||
|
||||
ok = False
|
||||
break
|
||||
if ok:
|
||||
found = True
|
||||
self.rq.scenequeue_covered.add(task)
|
||||
|
||||
logger.debug(1, 'Skip list (pre setsceneverify) %s', sorted(self.rq.scenequeue_covered))
|
||||
|
||||
@@ -1422,7 +1408,6 @@ class RunQueueExecuteScenequeue(RunQueueExecute):
|
||||
|
||||
self.scenequeue_covered = set()
|
||||
self.scenequeue_notcovered = set()
|
||||
self.scenequeue_notneeded = set()
|
||||
|
||||
# If we don't have any setscene functions, skip this step
|
||||
if len(self.rqdata.runq_setscene) == 0:
|
||||
@@ -1432,6 +1417,7 @@ class RunQueueExecuteScenequeue(RunQueueExecute):
|
||||
|
||||
self.stats = RunQueueStats(len(self.rqdata.runq_setscene))
|
||||
|
||||
endpoints = {}
|
||||
sq_revdeps = []
|
||||
sq_revdeps_new = []
|
||||
sq_revdeps_squash = []
|
||||
@@ -1446,15 +1432,12 @@ class RunQueueExecuteScenequeue(RunQueueExecute):
|
||||
self.runq_complete.append(0)
|
||||
self.runq_buildable.append(0)
|
||||
|
||||
# First process the chains up to the first setscene task.
|
||||
endpoints = {}
|
||||
for task in xrange(len(self.rqdata.runq_fnid)):
|
||||
sq_revdeps.append(copy.copy(self.rqdata.runq_revdeps[task]))
|
||||
sq_revdeps_new.append(set())
|
||||
if (len(self.rqdata.runq_revdeps[task]) == 0) and task not in self.rqdata.runq_setscene:
|
||||
endpoints[task] = set()
|
||||
|
||||
# Secondly process the chains between setscene tasks.
|
||||
for task in self.rqdata.runq_setscene:
|
||||
for dep in self.rqdata.runq_depends[task]:
|
||||
if dep not in endpoints:
|
||||
@@ -1470,8 +1453,6 @@ class RunQueueExecuteScenequeue(RunQueueExecute):
|
||||
if sq_revdeps_new[point]:
|
||||
tasks |= sq_revdeps_new[point]
|
||||
sq_revdeps_new[point] = set()
|
||||
if point in self.rqdata.runq_setscene:
|
||||
sq_revdeps_new[point] = tasks
|
||||
for dep in self.rqdata.runq_depends[point]:
|
||||
if point in sq_revdeps[dep]:
|
||||
sq_revdeps[dep].remove(point)
|
||||
@@ -1484,42 +1465,6 @@ class RunQueueExecuteScenequeue(RunQueueExecute):
|
||||
|
||||
process_endpoints(endpoints)
|
||||
|
||||
# Build a list of setscene tasks which as "unskippable"
|
||||
# These are direct endpoints referenced by the build
|
||||
endpoints2 = {}
|
||||
sq_revdeps2 = []
|
||||
sq_revdeps_new2 = []
|
||||
def process_endpoints2(endpoints):
|
||||
newendpoints = {}
|
||||
for point, task in endpoints.items():
|
||||
tasks = set([point])
|
||||
if task:
|
||||
tasks |= task
|
||||
if sq_revdeps_new2[point]:
|
||||
tasks |= sq_revdeps_new2[point]
|
||||
sq_revdeps_new2[point] = set()
|
||||
if point in self.rqdata.runq_setscene:
|
||||
sq_revdeps_new2[point] = tasks
|
||||
for dep in self.rqdata.runq_depends[point]:
|
||||
if point in sq_revdeps2[dep]:
|
||||
sq_revdeps2[dep].remove(point)
|
||||
if tasks:
|
||||
sq_revdeps_new2[dep] |= tasks
|
||||
if (len(sq_revdeps2[dep]) == 0 or len(sq_revdeps_new2[dep]) != 0) and dep not in self.rqdata.runq_setscene:
|
||||
newendpoints[dep] = tasks
|
||||
if len(newendpoints) != 0:
|
||||
process_endpoints2(newendpoints)
|
||||
for task in xrange(len(self.rqdata.runq_fnid)):
|
||||
sq_revdeps2.append(copy.copy(self.rqdata.runq_revdeps[task]))
|
||||
sq_revdeps_new2.append(set())
|
||||
if (len(self.rqdata.runq_revdeps[task]) == 0) and task not in self.rqdata.runq_setscene:
|
||||
endpoints2[task] = set()
|
||||
process_endpoints2(endpoints2)
|
||||
self.unskippable = []
|
||||
for task in self.rqdata.runq_setscene:
|
||||
if sq_revdeps_new2[task]:
|
||||
self.unskippable.append(self.rqdata.runq_setscene.index(task))
|
||||
|
||||
for task in xrange(len(self.rqdata.runq_fnid)):
|
||||
if task in self.rqdata.runq_setscene:
|
||||
deps = set()
|
||||
@@ -1680,13 +1625,6 @@ class RunQueueExecuteScenequeue(RunQueueExecute):
|
||||
# Find the next setscene to run
|
||||
for nexttask in xrange(self.stats.total):
|
||||
if self.runq_buildable[nexttask] == 1 and self.runq_running[nexttask] != 1:
|
||||
if nexttask in self.unskippable:
|
||||
logger.debug(2, "Setscene task %s is unskippable" % self.rqdata.get_user_idstring(self.rqdata.runq_setscene[nexttask]))
|
||||
if nexttask not in self.unskippable and len(self.sq_revdeps[nexttask]) > 0 and self.sq_revdeps[nexttask].issubset(self.scenequeue_covered) and self.check_dependencies(nexttask, self.sq_revdeps[nexttask], True):
|
||||
logger.debug(2, "Skipping setscene for task %s" % self.rqdata.get_user_idstring(self.rqdata.runq_setscene[nexttask]))
|
||||
self.task_skip(nexttask)
|
||||
self.scenequeue_notneeded.add(nexttask)
|
||||
return True
|
||||
task = nexttask
|
||||
break
|
||||
if task is not None:
|
||||
|
||||
@@ -191,8 +191,8 @@ class BitBakeServer(object):
|
||||
def saveConnectionDetails(self):
|
||||
return
|
||||
|
||||
def detach(self):
|
||||
return
|
||||
def detach(self, cooker_logfile):
|
||||
self.logfile = cooker_logfile
|
||||
|
||||
def establishConnection(self):
|
||||
self.connection = BitBakeServerConnection(self)
|
||||
|
||||
@@ -45,10 +45,10 @@ class ServerCommunicator():
|
||||
while True:
|
||||
# don't let the user ctrl-c while we're waiting for a response
|
||||
try:
|
||||
if self.connection.poll(20):
|
||||
if self.connection.poll(.5):
|
||||
return self.connection.recv()
|
||||
else:
|
||||
bb.fatal("Timeout while attempting to communicate with bitbake server")
|
||||
return None
|
||||
except KeyboardInterrupt:
|
||||
pass
|
||||
|
||||
@@ -256,7 +256,7 @@ class BitBakeServer(object):
|
||||
def saveConnectionDetails(self):
|
||||
return
|
||||
|
||||
def detach(self):
|
||||
def detach(self, cooker_logfile):
|
||||
self.server.start()
|
||||
return
|
||||
|
||||
@@ -266,5 +266,5 @@ class BitBakeServer(object):
|
||||
return self.connection
|
||||
|
||||
def launchUI(self, uifunc, *args):
|
||||
return uifunc(*args)
|
||||
return bb.cooker.server_main(self.cooker, uifunc, *args)
|
||||
|
||||
|
||||
@@ -280,8 +280,8 @@ class BitBakeServer(object):
|
||||
def saveConnectionDetails(self):
|
||||
self.serverinfo = BitbakeServerInfo(self.server.host, self.server.port)
|
||||
|
||||
def detach(self):
|
||||
daemonize.createDaemon(self.server.serve_forever, "bitbake-cookerdaemon.log")
|
||||
def detach(self, cooker_logfile):
|
||||
daemonize.createDaemon(self.server.serve_forever, cooker_logfile)
|
||||
del self.cooker
|
||||
del self.server
|
||||
|
||||
|
||||
@@ -49,7 +49,7 @@ class SignatureGenerator(object):
|
||||
return ("%s.%s.%s" % (stampbase, taskname, extrainfo)).rstrip('.')
|
||||
|
||||
def stampcleanmask(self, stampbase, file_name, taskname, extrainfo):
|
||||
return ("%s.%s.%s" % (stampbase, taskname, extrainfo)).rstrip('.')
|
||||
return ("%s.%s*.%s" % (stampbase, taskname, extrainfo)).rstrip('.')
|
||||
|
||||
def dump_sigtask(self, fn, task, stampbase, runtime):
|
||||
return
|
||||
@@ -98,7 +98,6 @@ class SignatureGeneratorBasic(SignatureGenerator):
|
||||
bb.error("Task %s from %s seems to be empty?!" % (task, fn))
|
||||
data = ''
|
||||
|
||||
gendeps[task] -= self.basewhitelist
|
||||
newdeps = gendeps[task]
|
||||
seen = set()
|
||||
while newdeps:
|
||||
@@ -108,12 +107,12 @@ class SignatureGeneratorBasic(SignatureGenerator):
|
||||
for dep in nextdeps:
|
||||
if dep in self.basewhitelist:
|
||||
continue
|
||||
gendeps[dep] -= self.basewhitelist
|
||||
newdeps |= gendeps[dep]
|
||||
newdeps -= seen
|
||||
|
||||
alldeps = sorted(seen)
|
||||
for dep in alldeps:
|
||||
alldeps = seen - self.basewhitelist
|
||||
|
||||
for dep in sorted(alldeps):
|
||||
data = data + dep
|
||||
if dep in lookupcache:
|
||||
var = lookupcache[dep]
|
||||
@@ -127,7 +126,7 @@ class SignatureGeneratorBasic(SignatureGenerator):
|
||||
if var:
|
||||
data = data + str(var)
|
||||
self.basehash[fn + "." + task] = hashlib.md5(data).hexdigest()
|
||||
taskdeps[task] = alldeps
|
||||
taskdeps[task] = sorted(alldeps)
|
||||
|
||||
self.taskdeps[fn] = taskdeps
|
||||
self.gendeps[fn] = gendeps
|
||||
@@ -277,6 +276,7 @@ class SignatureGeneratorBasicHash(SignatureGeneratorBasic):
|
||||
k = fn + "." + taskname
|
||||
if clean:
|
||||
h = "*"
|
||||
taskname = taskname + "*"
|
||||
elif k in self.taskhash:
|
||||
h = self.taskhash[k]
|
||||
else:
|
||||
@@ -331,12 +331,12 @@ def compare_sigfiles(a, b, recursecb = None):
|
||||
return changed, added, removed
|
||||
|
||||
if 'basewhitelist' in a_data and a_data['basewhitelist'] != b_data['basewhitelist']:
|
||||
output.append("basewhitelist changed from '%s' to '%s'" % (a_data['basewhitelist'], b_data['basewhitelist']))
|
||||
output.append("basewhitelist changed from %s to %s" % (a_data['basewhitelist'], b_data['basewhitelist']))
|
||||
if a_data['basewhitelist'] and b_data['basewhitelist']:
|
||||
output.append("changed items: %s" % a_data['basewhitelist'].symmetric_difference(b_data['basewhitelist']))
|
||||
|
||||
if 'taskwhitelist' in a_data and a_data['taskwhitelist'] != b_data['taskwhitelist']:
|
||||
output.append("taskwhitelist changed from '%s' to '%s'" % (a_data['taskwhitelist'], b_data['taskwhitelist']))
|
||||
output.append("taskwhitelist changed from %s to %s" % (a_data['taskwhitelist'], b_data['taskwhitelist']))
|
||||
if a_data['taskwhitelist'] and b_data['taskwhitelist']:
|
||||
output.append("changed items: %s" % a_data['taskwhitelist'].symmetric_difference(b_data['taskwhitelist']))
|
||||
|
||||
@@ -349,7 +349,7 @@ def compare_sigfiles(a, b, recursecb = None):
|
||||
changed, added, removed = dict_diff(a_data['gendeps'], b_data['gendeps'], a_data['basewhitelist'] & b_data['basewhitelist'])
|
||||
if changed:
|
||||
for dep in changed:
|
||||
output.append("List of dependencies for variable %s changed from '%s' to '%s'" % (dep, a_data['gendeps'][dep], b_data['gendeps'][dep]))
|
||||
output.append("List of dependencies for variable %s changed from %s to %s" % (dep, a_data['gendeps'][dep], b_data['gendeps'][dep]))
|
||||
if a_data['gendeps'][dep] and b_data['gendeps'][dep]:
|
||||
output.append("changed items: %s" % a_data['gendeps'][dep].symmetric_difference(b_data['gendeps'][dep]))
|
||||
if added:
|
||||
@@ -363,7 +363,7 @@ def compare_sigfiles(a, b, recursecb = None):
|
||||
changed, added, removed = dict_diff(a_data['varvals'], b_data['varvals'])
|
||||
if changed:
|
||||
for dep in changed:
|
||||
output.append("Variable %s value changed from '%s' to '%s'" % (dep, a_data['varvals'][dep], b_data['varvals'][dep]))
|
||||
output.append("Variable %s value changed from %s to %s" % (dep, a_data['varvals'][dep], b_data['varvals'][dep]))
|
||||
|
||||
changed, added, removed = dict_diff(a_data['file_checksum_values'], b_data['file_checksum_values'])
|
||||
if changed:
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
# ex:ts=4:sw=4:sts=4:et
|
||||
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
|
||||
#
|
||||
# BitBake Test for codeparser.py
|
||||
#
|
||||
@@ -26,9 +24,6 @@ import bb
|
||||
|
||||
logger = logging.getLogger('BitBake.TestCodeParser')
|
||||
|
||||
# bb.data references bb.parse but can't directly import due to circular dependencies.
|
||||
# Hack around it for now :(
|
||||
import bb.parse
|
||||
import bb.data
|
||||
|
||||
class ReferenceTest(unittest.TestCase):
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
# ex:ts=4:sw=4:sts=4:et
|
||||
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
|
||||
#
|
||||
# BitBake Tests for Copy-on-Write (cow.py)
|
||||
#
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
# ex:ts=4:sw=4:sts=4:et
|
||||
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
|
||||
#
|
||||
# BitBake Tests for the Data Store (data.py/data_smart.py)
|
||||
#
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
# ex:ts=4:sw=4:sts=4:et
|
||||
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
|
||||
#
|
||||
# BitBake Tests for the Fetcher (fetch2/)
|
||||
#
|
||||
@@ -23,252 +21,11 @@ import unittest
|
||||
import tempfile
|
||||
import subprocess
|
||||
import os
|
||||
from bb.fetch2 import URI
|
||||
import bb
|
||||
|
||||
class URITest(unittest.TestCase):
|
||||
test_uris = {
|
||||
"http://www.google.com/index.html" : {
|
||||
'uri': 'http://www.google.com/index.html',
|
||||
'scheme': 'http',
|
||||
'hostname': 'www.google.com',
|
||||
'port': None,
|
||||
'hostport': 'www.google.com',
|
||||
'path': '/index.html',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': False
|
||||
},
|
||||
"http://www.google.com/index.html;param1=value1" : {
|
||||
'uri': 'http://www.google.com/index.html;param1=value1',
|
||||
'scheme': 'http',
|
||||
'hostname': 'www.google.com',
|
||||
'port': None,
|
||||
'hostport': 'www.google.com',
|
||||
'path': '/index.html',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {
|
||||
'param1': 'value1'
|
||||
},
|
||||
'relative': False
|
||||
},
|
||||
"http://www.example.com:8080/index.html" : {
|
||||
'uri': 'http://www.example.com:8080/index.html',
|
||||
'scheme': 'http',
|
||||
'hostname': 'www.example.com',
|
||||
'port': 8080,
|
||||
'hostport': 'www.example.com:8080',
|
||||
'path': '/index.html',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': False
|
||||
},
|
||||
"cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg" : {
|
||||
'uri': 'cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg',
|
||||
'scheme': 'cvs',
|
||||
'hostname': 'cvs.handhelds.org',
|
||||
'port': None,
|
||||
'hostport': 'cvs.handhelds.org',
|
||||
'path': '/cvs',
|
||||
'userinfo': 'anoncvs',
|
||||
'username': 'anoncvs',
|
||||
'password': '',
|
||||
'params': {
|
||||
'module': 'familiar/dist/ipkg'
|
||||
},
|
||||
'relative': False
|
||||
},
|
||||
"cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg": {
|
||||
'uri': 'cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg',
|
||||
'scheme': 'cvs',
|
||||
'hostname': 'cvs.handhelds.org',
|
||||
'port': None,
|
||||
'hostport': 'cvs.handhelds.org',
|
||||
'path': '/cvs',
|
||||
'userinfo': 'anoncvs:anonymous',
|
||||
'username': 'anoncvs',
|
||||
'password': 'anonymous',
|
||||
'params': {
|
||||
'tag': 'V0-99-81',
|
||||
'module': 'familiar/dist/ipkg'
|
||||
},
|
||||
'relative': False
|
||||
},
|
||||
"file://example.diff": { # NOTE: Not RFC compliant!
|
||||
'uri': 'file:example.diff',
|
||||
'scheme': 'file',
|
||||
'hostname': '',
|
||||
'port': None,
|
||||
'hostport': '',
|
||||
'path': 'example.diff',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': True
|
||||
},
|
||||
"file:example.diff": { # NOTE: RFC compliant version of the former
|
||||
'uri': 'file:example.diff',
|
||||
'scheme': 'file',
|
||||
'hostname': '',
|
||||
'port': None,
|
||||
'hostport': '',
|
||||
'path': 'example.diff',
|
||||
'userinfo': '',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': True
|
||||
},
|
||||
"file:///tmp/example.diff": {
|
||||
'uri': 'file:///tmp/example.diff',
|
||||
'scheme': 'file',
|
||||
'hostname': '',
|
||||
'port': None,
|
||||
'hostport': '',
|
||||
'path': '/tmp/example.diff',
|
||||
'userinfo': '',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': False
|
||||
},
|
||||
"git:///path/example.git": {
|
||||
'uri': 'git:///path/example.git',
|
||||
'scheme': 'git',
|
||||
'hostname': '',
|
||||
'port': None,
|
||||
'hostport': '',
|
||||
'path': '/path/example.git',
|
||||
'userinfo': '',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': False
|
||||
},
|
||||
"git:path/example.git": {
|
||||
'uri': 'git:path/example.git',
|
||||
'scheme': 'git',
|
||||
'hostname': '',
|
||||
'port': None,
|
||||
'hostport': '',
|
||||
'path': 'path/example.git',
|
||||
'userinfo': '',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': True
|
||||
},
|
||||
"git://example.net/path/example.git": {
|
||||
'uri': 'git://example.net/path/example.git',
|
||||
'scheme': 'git',
|
||||
'hostname': 'example.net',
|
||||
'port': None,
|
||||
'hostport': 'example.net',
|
||||
'path': '/path/example.git',
|
||||
'userinfo': '',
|
||||
'userinfo': '',
|
||||
'username': '',
|
||||
'password': '',
|
||||
'params': {},
|
||||
'relative': False
|
||||
}
|
||||
}
|
||||
|
||||
def test_uri(self):
|
||||
for test_uri, ref in self.test_uris.items():
|
||||
uri = URI(test_uri)
|
||||
|
||||
self.assertEqual(str(uri), ref['uri'])
|
||||
|
||||
# expected attributes
|
||||
self.assertEqual(uri.scheme, ref['scheme'])
|
||||
|
||||
self.assertEqual(uri.userinfo, ref['userinfo'])
|
||||
self.assertEqual(uri.username, ref['username'])
|
||||
self.assertEqual(uri.password, ref['password'])
|
||||
|
||||
self.assertEqual(uri.hostname, ref['hostname'])
|
||||
self.assertEqual(uri.port, ref['port'])
|
||||
self.assertEqual(uri.hostport, ref['hostport'])
|
||||
|
||||
self.assertEqual(uri.path, ref['path'])
|
||||
self.assertEqual(uri.params, ref['params'])
|
||||
|
||||
self.assertEqual(uri.relative, ref['relative'])
|
||||
|
||||
def test_dict(self):
|
||||
for test in self.test_uris.values():
|
||||
uri = URI()
|
||||
|
||||
self.assertEqual(uri.scheme, '')
|
||||
self.assertEqual(uri.userinfo, '')
|
||||
self.assertEqual(uri.username, '')
|
||||
self.assertEqual(uri.password, '')
|
||||
self.assertEqual(uri.hostname, '')
|
||||
self.assertEqual(uri.port, None)
|
||||
self.assertEqual(uri.path, '')
|
||||
self.assertEqual(uri.params, {})
|
||||
|
||||
|
||||
uri.scheme = test['scheme']
|
||||
self.assertEqual(uri.scheme, test['scheme'])
|
||||
|
||||
uri.userinfo = test['userinfo']
|
||||
self.assertEqual(uri.userinfo, test['userinfo'])
|
||||
self.assertEqual(uri.username, test['username'])
|
||||
self.assertEqual(uri.password, test['password'])
|
||||
|
||||
uri.hostname = test['hostname']
|
||||
self.assertEqual(uri.hostname, test['hostname'])
|
||||
self.assertEqual(uri.hostport, test['hostname'])
|
||||
|
||||
uri.port = test['port']
|
||||
self.assertEqual(uri.port, test['port'])
|
||||
self.assertEqual(uri.hostport, test['hostport'])
|
||||
|
||||
uri.path = test['path']
|
||||
self.assertEqual(uri.path, test['path'])
|
||||
|
||||
uri.params = test['params']
|
||||
self.assertEqual(uri.params, test['params'])
|
||||
|
||||
self.assertEqual(str(uri)+str(uri.relative), str(test['uri'])+str(test['relative']))
|
||||
|
||||
self.assertEqual(str(uri), test['uri'])
|
||||
|
||||
uri.params = {}
|
||||
self.assertEqual(uri.params, {})
|
||||
self.assertEqual(str(uri), (str(uri).split(";"))[0])
|
||||
|
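The round trip these URI tests exercise can be reproduced directly; a short sketch, assuming BitBake's lib/ directory is on sys.path (e.g. inside a poky checkout):

from bb.fetch2 import URI

u = URI("http://www.example.com:8080/index.html;param1=value1")
print(u.scheme, u.hostport, u.path)   # http www.example.com:8080 /index.html
print(u.params)                       # {'param1': 'value1'}

u.params = {}                         # clearing params also drops the ';...' suffix
print(str(u))                         # http://www.example.com:8080/index.html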
||||
class FetcherTest(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
self.d = bb.data.init()
|
||||
self.tempdir = tempfile.mkdtemp()
|
||||
self.dldir = os.path.join(self.tempdir, "download")
|
||||
os.mkdir(self.dldir)
|
||||
self.d.setVar("DL_DIR", self.dldir)
|
||||
self.unpackdir = os.path.join(self.tempdir, "unpacked")
|
||||
os.mkdir(self.unpackdir)
|
||||
persistdir = os.path.join(self.tempdir, "persistdata")
|
||||
self.d.setVar("PERSISTENT_DIR", persistdir)
|
||||
|
||||
def tearDown(self):
|
||||
bb.utils.prunedir(self.tempdir)
|
||||
|
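The fixture above is also a convenient pattern for experimenting with the fetcher outside the test suite; a rough sketch under the same assumption that bb is importable:

import os
import tempfile
import bb.data
import bb.utils

# Throwaway datastore plus a temporary download directory, pruned again
# afterwards, exactly as setUp()/tearDown() do above.
d = bb.data.init()
tempdir = tempfile.mkdtemp()
dldir = os.path.join(tempdir, "download")
os.mkdir(dldir)
d.setVar("DL_DIR", dldir)
# ... drive bb.fetch.Fetch([...], d) here ...
bb.utils.prunedir(tempdir)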
||||
class MirrorUriTest(FetcherTest):
|
||||
|
||||
replaceuris = {
|
||||
("git://git.invalid.infradead.org/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/.*", "http://somewhere.org/somedir/")
|
||||
: "http://somewhere.org/somedir/git2_git.invalid.infradead.org.mtd-utils.git.tar.gz",
|
||||
@@ -309,6 +66,87 @@ class MirrorUriTest(FetcherTest):
|
||||
"https://.*/.* file:///someotherpath/downloads/ \n" \
|
||||
"http://.*/.* file:///someotherpath/downloads/ \n"
|
||||
|
||||
def setUp(self):
|
||||
self.d = bb.data.init()
|
||||
self.tempdir = tempfile.mkdtemp()
|
||||
self.dldir = os.path.join(self.tempdir, "download")
|
||||
os.mkdir(self.dldir)
|
||||
self.d.setVar("DL_DIR", self.dldir)
|
||||
self.unpackdir = os.path.join(self.tempdir, "unpacked")
|
||||
os.mkdir(self.unpackdir)
|
||||
persistdir = os.path.join(self.tempdir, "persistdata")
|
||||
self.d.setVar("PERSISTENT_DIR", persistdir)
|
||||
|
||||
def tearDown(self):
|
||||
bb.utils.prunedir(self.tempdir)
|
||||
|
||||
def test_fetch(self):
|
||||
fetcher = bb.fetch.Fetch(["http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", "http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.1.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.1.tar.gz"), 57892)
|
||||
self.d.setVar("BB_NO_NETWORK", "1")
|
||||
fetcher = bb.fetch.Fetch(["http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", "http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.1.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
fetcher.unpack(self.unpackdir)
|
||||
self.assertEqual(len(os.listdir(self.unpackdir + "/bitbake-1.0/")), 9)
|
||||
self.assertEqual(len(os.listdir(self.unpackdir + "/bitbake-1.1/")), 9)
|
||||
|
||||
def test_fetch_mirror(self):
|
||||
self.d.setVar("MIRRORS", "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake")
|
||||
fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
|
||||
|
||||
def test_fetch_premirror(self):
|
||||
self.d.setVar("PREMIRRORS", "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake")
|
||||
fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
|
||||
|
||||
def gitfetcher(self, url1, url2):
|
||||
def checkrevision(self, fetcher):
|
||||
fetcher.unpack(self.unpackdir)
|
||||
revision = subprocess.check_output("git rev-parse HEAD", shell=True, cwd=self.unpackdir + "/git").strip()
|
||||
self.assertEqual(revision, "270a05b0b4ba0959fe0624d2a4885d7b70426da5")
|
||||
|
||||
self.d.setVar("BB_GENERATE_MIRROR_TARBALLS", "1")
|
||||
self.d.setVar("SRCREV", "270a05b0b4ba0959fe0624d2a4885d7b70426da5")
|
||||
fetcher = bb.fetch.Fetch([url1], self.d)
|
||||
fetcher.download()
|
||||
checkrevision(self, fetcher)
|
||||
# Wipe out the dldir clone and the unpacked source, turn off the network and check mirror tarball works
|
||||
bb.utils.prunedir(self.dldir + "/git2/")
|
||||
bb.utils.prunedir(self.unpackdir)
|
||||
self.d.setVar("BB_NO_NETWORK", "1")
|
||||
fetcher = bb.fetch.Fetch([url2], self.d)
|
||||
fetcher.download()
|
||||
checkrevision(self, fetcher)
|
||||
|
||||
def test_gitfetch(self):
|
||||
url1 = url2 = "git://git.openembedded.org/bitbake"
|
||||
self.gitfetcher(url1, url2)
|
||||
|
||||
def test_gitfetch_premirror(self):
|
||||
url1 = "git://git.openembedded.org/bitbake"
|
||||
url2 = "git://someserver.org/bitbake"
|
||||
self.d.setVar("PREMIRRORS", "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
|
||||
self.gitfetcher(url1, url2)
|
||||
|
||||
def test_gitfetch_premirror2(self):
|
||||
url1 = url2 = "git://someserver.org/bitbake"
|
||||
self.d.setVar("PREMIRRORS", "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
|
||||
self.gitfetcher(url1, url2)
|
||||
|
||||
def test_gitfetch_premirror3(self):
|
||||
realurl = "git://git.openembedded.org/bitbake"
|
||||
dummyurl = "git://someserver.org/bitbake"
|
||||
self.sourcedir = self.unpackdir.replace("unpacked", "sourcemirror.git")
|
||||
os.chdir(self.tempdir)
|
||||
subprocess.check_output("git clone %s %s 2> /dev/null" % (realurl, self.sourcedir), shell=True)
|
||||
self.d.setVar("PREMIRRORS", "%s git://%s;protocol=file \n" % (dummyurl, self.sourcedir))
|
||||
self.gitfetcher(dummyurl, dummyurl)
|
||||
|
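MIRRORS and PREMIRRORS, used throughout the tests above, are whitespace-separated (source regex, replacement) pairs. A datastore-only sketch using the same placeholder hosts as the tests:

import bb.data

d = bb.data.init()
# Map the dummy premirror host onto the real upstream, and any http fetch
# onto the Yocto release area; the tests set BB_NO_NETWORK after priming the
# download directory to prove a fetch can succeed without the original URL.
d.setVar("PREMIRRORS",
         "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
d.setVar("MIRRORS",
         "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake \n")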
||||
def test_urireplace(self):
|
||||
for k, v in self.replaceuris.items():
|
||||
ud = bb.fetch.FetchData(k[0], self.d)
|
||||
@@ -330,85 +168,13 @@ class MirrorUriTest(FetcherTest):
|
||||
uris, uds = bb.fetch2.build_mirroruris(fetcher, mirrors, self.d)
|
||||
self.assertEqual(uris, ['file:///someotherpath/downloads/bitbake-1.0.tar.gz'])
|
||||
|
||||
class FetcherNetworkTest(FetcherTest):
|
||||
|
||||
if os.environ.get("BB_SKIP_NETTESTS") == "yes":
|
||||
print("Unset BB_SKIP_NETTESTS to run network tests")
|
||||
else:
|
||||
def test_fetch(self):
|
||||
fetcher = bb.fetch.Fetch(["http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", "http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.1.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.1.tar.gz"), 57892)
|
||||
self.d.setVar("BB_NO_NETWORK", "1")
|
||||
fetcher = bb.fetch.Fetch(["http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", "http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.1.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
fetcher.unpack(self.unpackdir)
|
||||
self.assertEqual(len(os.listdir(self.unpackdir + "/bitbake-1.0/")), 9)
|
||||
self.assertEqual(len(os.listdir(self.unpackdir + "/bitbake-1.1/")), 9)
|
||||
|
||||
def test_fetch_mirror(self):
|
||||
self.d.setVar("MIRRORS", "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake")
|
||||
fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
|
||||
|
||||
def test_fetch_premirror(self):
|
||||
self.d.setVar("PREMIRRORS", "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake")
|
||||
fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
|
||||
fetcher.download()
|
||||
self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
|
||||
|
||||
def gitfetcher(self, url1, url2):
|
||||
def checkrevision(self, fetcher):
|
||||
fetcher.unpack(self.unpackdir)
|
||||
revision = bb.process.run("git rev-parse HEAD", shell=True, cwd=self.unpackdir + "/git")[0].strip()
|
||||
self.assertEqual(revision, "270a05b0b4ba0959fe0624d2a4885d7b70426da5")
|
||||
|
||||
self.d.setVar("BB_GENERATE_MIRROR_TARBALLS", "1")
|
||||
self.d.setVar("SRCREV", "270a05b0b4ba0959fe0624d2a4885d7b70426da5")
|
||||
fetcher = bb.fetch.Fetch([url1], self.d)
|
||||
fetcher.download()
|
||||
checkrevision(self, fetcher)
|
||||
# Wipe out the dldir clone and the unpacked source, turn off the network and check mirror tarball works
|
||||
bb.utils.prunedir(self.dldir + "/git2/")
|
||||
bb.utils.prunedir(self.unpackdir)
|
||||
self.d.setVar("BB_NO_NETWORK", "1")
|
||||
fetcher = bb.fetch.Fetch([url2], self.d)
|
||||
fetcher.download()
|
||||
checkrevision(self, fetcher)
|
||||
|
||||
def test_gitfetch(self):
|
||||
url1 = url2 = "git://git.openembedded.org/bitbake"
|
||||
self.gitfetcher(url1, url2)
|
||||
|
||||
def test_gitfetch_premirror(self):
|
||||
url1 = "git://git.openembedded.org/bitbake"
|
||||
url2 = "git://someserver.org/bitbake"
|
||||
self.d.setVar("PREMIRRORS", "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
|
||||
self.gitfetcher(url1, url2)
|
||||
|
||||
def test_gitfetch_premirror2(self):
|
||||
url1 = url2 = "git://someserver.org/bitbake"
|
||||
self.d.setVar("PREMIRRORS", "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
|
||||
self.gitfetcher(url1, url2)
|
||||
|
||||
def test_gitfetch_premirror3(self):
|
||||
realurl = "git://git.openembedded.org/bitbake"
|
||||
dummyurl = "git://someserver.org/bitbake"
|
||||
self.sourcedir = self.unpackdir.replace("unpacked", "sourcemirror.git")
|
||||
os.chdir(self.tempdir)
|
||||
bb.process.run("git clone %s %s 2> /dev/null" % (realurl, self.sourcedir), shell=True)
|
||||
self.d.setVar("PREMIRRORS", "%s git://%s;protocol=file \n" % (dummyurl, self.sourcedir))
|
||||
self.gitfetcher(dummyurl, dummyurl)
|
||||
|
||||
class URLHandle(unittest.TestCase):
|
||||
|
||||
datatable = {
|
||||
"http://www.google.com/index.html" : ('http', 'www.google.com', '/index.html', '', '', {}),
|
||||
"cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg" : ('cvs', 'cvs.handhelds.org', '/cvs', 'anoncvs', '', {'module': 'familiar/dist/ipkg'}),
|
||||
"cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg" : ('cvs', 'cvs.handhelds.org', '/cvs', 'anoncvs', 'anonymous', {'tag': 'V0-99-81', 'module': 'familiar/dist/ipkg'}),
|
||||
"git://git.openembedded.org/bitbake;branch=@foo" : ('git', 'git.openembedded.org', '/bitbake', '', '', {'branch': '@foo'})
|
||||
"cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg" : ('cvs', 'cvs.handhelds.org', '/cvs', 'anoncvs', 'anonymous', {'tag': 'V0-99-81', 'module': 'familiar/dist/ipkg'})
|
||||
}
|
||||
|
||||
def test_decodeurl(self):
|
||||
|
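The datatable above encodes what the fetcher's decode helper returns for each URL: a (scheme, host, path, user, password, params) tuple. A quick check, again assuming a BitBake checkout on sys.path:

from bb.fetch2 import decodeurl

print(decodeurl("cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg"))
# ('cvs', 'cvs.handhelds.org', '/cvs', 'anoncvs', '', {'module': 'familiar/dist/ipkg'})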
||||
@@ -1,5 +1,3 @@
|
||||
# ex:ts=4:sw=4:sts=4:et
|
||||
# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
|
||||
#
|
||||
# BitBake Tests for utils.py
|
||||
#
|
||||
|
||||
@@ -29,17 +29,15 @@ from bb.cooker import state
import bb.fetch2

class Tinfoil:
def __init__(self, output=sys.stdout):
def __init__(self):
# Needed to avoid deprecation warnings with python 2.6
warnings.filterwarnings("ignore", category=DeprecationWarning)

# Set up logging
self.logger = logging.getLogger('BitBake')
console = logging.StreamHandler(output)
bb.msg.addDefaultlogFilter(console)
console = logging.StreamHandler(sys.stdout)
format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
if output.isatty():
format.enable_color()
bb.msg.addDefaultlogFilter(console)
console.setFormatter(format)
self.logger.addHandler(console)
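The logging half of this constructor stands alone; a sketch of the TTY-aware handler it builds, assuming bb.msg from a BitBake checkout (the helper name is illustrative):

import logging
import sys
import bb.msg

def make_console_handler(output=sys.stdout):
    # Colour is only enabled when the destination stream is a terminal.
    console = logging.StreamHandler(output)
    fmt = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
    if output.isatty():
        fmt.enable_color()
    bb.msg.addDefaultlogFilter(console)
    console.setFormatter(fmt)
    return console

logging.getLogger('BitBake').addHandler(make_console_handler())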
||||
|
||||
@@ -204,6 +204,8 @@ class BuildDetailsPage (HobPage):
|
||||
def add_build_fail_top_bar(self, actions, log_file=None):
|
||||
primary_action = "Edit %s" % actions
|
||||
|
||||
self.notebook.set_page("Issues")
|
||||
|
||||
color = HobColors.ERROR
|
||||
build_fail_top = gtk.EventBox()
|
||||
#build_fail_top.set_size_request(-1, 200)
|
||||
@@ -224,16 +226,7 @@ class BuildDetailsPage (HobPage):
|
||||
|
||||
label = gtk.Label()
|
||||
label.set_alignment(0.0, 0.5)
|
||||
# Ensure variable disk_full is defined
|
||||
if not hasattr(self.builder, 'disk_full'):
|
||||
self.builder.disk_full = False
|
||||
|
||||
if self.builder.disk_full:
|
||||
markup = "<span size='medium'>There is no disk space left, so Hob cannot finish building your image. Free up some disk space\n"
|
||||
markup += "and restart the build. Check the \"Issues\" tab for more details</span>"
|
||||
label.set_markup(markup)
|
||||
else:
|
||||
label.set_markup("<span size='medium'>Check the \"Issues\" information for more details</span>")
|
||||
label.set_markup("<span size='medium'>Check the \"Issues\" information for more details</span>")
|
||||
build_fail_tab.attach(label, 4, 40, 4, 9)
|
||||
|
||||
# create button 'Edit packages'
|
||||
@@ -241,36 +234,22 @@ class BuildDetailsPage (HobPage):
|
||||
#action_button.set_size_request(-1, 40)
|
||||
action_button.set_tooltip_text("Edit the %s parameters" % actions)
|
||||
action_button.connect('clicked', self.failure_primary_action_button_clicked_cb, primary_action)
|
||||
build_fail_tab.attach(action_button, 4, 13, 9, 12)
|
||||
|
||||
if log_file:
|
||||
open_log_button = HobAltButton("Open log")
|
||||
open_log_button.set_relief(gtk.RELIEF_HALF)
|
||||
open_log_button.set_tooltip_text("Open the build's log file")
|
||||
open_log_button.connect('clicked', self.open_log_button_clicked_cb, log_file)
|
||||
build_fail_tab.attach(open_log_button, 14, 23, 9, 12)
|
||||
|
||||
attach_pos = (24 if log_file else 14)
|
||||
file_bug_button = HobAltButton('File a bug')
|
||||
file_bug_button.set_relief(gtk.RELIEF_HALF)
|
||||
file_bug_button.set_tooltip_text("Open the Yocto Project bug tracking website")
|
||||
file_bug_button.connect('clicked', self.failure_activate_file_bug_link_cb)
|
||||
build_fail_tab.attach(file_bug_button, attach_pos, attach_pos + 9, 9, 12)
|
||||
|
||||
if not self.builder.disk_full:
|
||||
build_fail_tab.attach(action_button, 4, 13, 9, 12)
|
||||
if log_file:
|
||||
build_fail_tab.attach(open_log_button, 14, 23, 9, 12)
|
||||
build_fail_tab.attach(file_bug_button, attach_pos, attach_pos + 9, 9, 12)
|
||||
|
||||
else:
|
||||
restart_build = HobButton("Restart the build")
|
||||
restart_build.set_tooltip_text("Restart the build")
|
||||
restart_build.connect('clicked', self.restart_build_button_clicked_cb)
|
||||
|
||||
build_fail_tab.attach(restart_build, 4, 13, 9, 12)
|
||||
build_fail_tab.attach(action_button, 14, 23, 9, 12)
|
||||
if log_file:
|
||||
build_fail_tab.attach(open_log_button, attach_pos, attach_pos + 9, 9, 12)
|
||||
|
||||
self.builder.disk_full = False
|
||||
return build_fail_top
|
||||
|
||||
def show_fail_page(self, title):
|
||||
@@ -285,7 +264,6 @@ class BuildDetailsPage (HobPage):
|
||||
|
||||
self.vbox.pack_start(self.notebook, expand=True, fill=True)
|
||||
self.show_all()
|
||||
self.notebook.set_page("Issues")
|
||||
self.back_button.hide()
|
||||
|
||||
def add_build_stop_top_bar(self, action, log_file=None):
|
||||
@@ -370,7 +348,6 @@ class BuildDetailsPage (HobPage):
|
||||
|
||||
self.box_group_area.pack_end(self.button_box, expand=False, fill=False)
|
||||
self.show_all()
|
||||
self.notebook.set_page("Log")
|
||||
self.back_button.hide()
|
||||
|
||||
self.reset_build_status()
|
||||
@@ -416,9 +393,6 @@ class BuildDetailsPage (HobPage):
|
||||
elif "Edit image" in action:
|
||||
self.builder.show_configuration()
|
||||
|
||||
def restart_build_button_clicked_cb(self, button):
|
||||
self.builder.just_bake()
|
||||
|
||||
def stop_primary_action_button_clicked_cb(self, button, action):
|
||||
if "recipes" in action:
|
||||
self.builder.show_recipes()
|
||||
@@ -429,8 +403,7 @@ class BuildDetailsPage (HobPage):
|
||||
|
||||
def open_log_button_clicked_cb(self, button, log_file):
|
||||
if log_file:
|
||||
log_file = "file:///" + log_file
|
||||
gtk.show_uri(screen=button.get_screen(), uri=log_file, timestamp=0)
|
||||
os.system("xdg-open /%s" % log_file)
|
||||
|
||||
def failure_activate_file_bug_link_cb(self, button):
|
||||
button.child.emit('activate-link', "http://bugzilla.yoctoproject.org")
|
||||
|
||||
@@ -38,16 +38,11 @@ from bb.ui.crumbs.builddetailspage import BuildDetailsPage
|
||||
from bb.ui.crumbs.imagedetailspage import ImageDetailsPage
|
||||
from bb.ui.crumbs.sanitycheckpage import SanityCheckPage
|
||||
from bb.ui.crumbs.hobwidget import hwc, HobButton, HobAltButton
|
||||
from bb.ui.crumbs.hig import CrumbsMessageDialog, ImageSelectionDialog, \
|
||||
AdvancedSettingDialog, SimpleSettingsDialog, \
|
||||
LayerSelectionDialog, DeployImageDialog
|
||||
from bb.ui.crumbs.persistenttooltip import PersistentTooltip
|
||||
import bb.ui.crumbs.utils
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
from bb.ui.crumbs.hig.simplesettingsdialog import SimpleSettingsDialog
|
||||
from bb.ui.crumbs.hig.advancedsettingsdialog import AdvancedSettingsDialog
|
||||
from bb.ui.crumbs.hig.deployimagedialog import DeployImageDialog
|
||||
from bb.ui.crumbs.hig.layerselectiondialog import LayerSelectionDialog
|
||||
from bb.ui.crumbs.hig.imageselectiondialog import ImageSelectionDialog
|
||||
from bb.ui.crumbs.hig.parsingwarningsdialog import ParsingWarningsDialog
|
||||
from bb.ui.crumbs.hig.propertydialog import PropertyDialog
|
||||
|
||||
hobVer = 20120808
|
||||
|
||||
@@ -56,10 +51,10 @@ class Configuration:

@classmethod
def parse_proxy_string(cls, proxy):
pattern = "^\s*((http|https|ftp|socks|cvs)://)?((\S+):(\S+)@)?([^\s:]+)(:(\d+))?/?"
pattern = "^\s*((http|https|ftp|git|cvs)://)?((\S+):(\S+)@)?(\S+):(\d+)/?"
match = re.search(pattern, proxy)
if match:
return match.group(2), match.group(4), match.group(5), match.group(6), match.group(8)
return match.group(2), match.group(4), match.group(5), match.group(6), match.group(7)
else:
return None, None, None, "", ""

@@ -87,14 +82,13 @@ class Configuration:

@classmethod
def make_proxy_string(cls, prot, user, passwd, host, port, default_prot=""):
if host == None or host == "":# or port == None or port == "":
if host == None or host == "" or port == None or port == "":
return ""

return Configuration.make_host_string(prot, user, passwd, host, default_prot) + (":" + Configuration.make_port_string(port) if port else "")
return Configuration.make_host_string(prot, user, passwd, host, default_prot) + ":" + Configuration.make_port_string(port)
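The two patterns in the parse_proxy_string() hunk differ in which schemes they accept and in whether the port is mandatory. The socks-aware variant can be exercised standalone (the example hosts are placeholders):

import re

pattern = r"^\s*((http|https|ftp|socks|cvs)://)?((\S+):(\S+)@)?([^\s:]+)(:(\d+))?/?"

m = re.search(pattern, "http://user:secret@proxy.example.com:8080/")
print(m.group(2), m.group(4), m.group(5), m.group(6), m.group(8))
# http user secret proxy.example.com 8080

m = re.search(pattern, "socks://socks.example.com")
print(m.group(2), m.group(6), m.group(8))   # socks socks.example.com None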
||||
def __init__(self):
|
||||
self.curr_mach = ""
|
||||
self.selected_image = None
|
||||
# settings
|
||||
self.curr_distro = ""
|
||||
self.dldir = self.sstatedir = self.sstatemirror = ""
|
||||
@@ -124,14 +118,14 @@ class Configuration:
|
||||
"http" : [None, None, None, "", ""], # protocol : [prot, user, passwd, host, port]
|
||||
"https" : [None, None, None, "", ""],
|
||||
"ftp" : [None, None, None, "", ""],
|
||||
"socks" : [None, None, None, "", ""],
|
||||
"git" : [None, None, None, "", ""],
|
||||
"cvs" : [None, None, None, "", ""],
|
||||
}
|
||||
|
||||
def clear_selection(self):
|
||||
self.selected_image = None
|
||||
self.selected_recipes = []
|
||||
self.selected_packages = []
|
||||
self.initial_selected_image = None
|
||||
self.initial_selected_packages = []
|
||||
self.initial_user_selected_packages = []
|
||||
|
||||
@@ -177,20 +171,33 @@ class Configuration:
|
||||
# self.extra_setting/self.toolchain_build
|
||||
# bblayers.conf
|
||||
self.layers = params["layer"].split()
|
||||
self.layers_non_removable = params["layers_non_removable"].split()
|
||||
self.default_task = params["default_task"]
|
||||
|
||||
# proxy settings
|
||||
self.enable_proxy = params["http_proxy"] != "" or params["https_proxy"] != "" \
|
||||
or params["ftp_proxy"] != "" or params["socks_proxy"] != "" \
|
||||
self.enable_proxy = params["http_proxy"] != "" or params["https_proxy"] != "" or params["ftp_proxy"] != "" \
|
||||
or params["git_proxy_host"] != "" or params["git_proxy_port"] != "" \
|
||||
or params["cvs_proxy_host"] != "" or params["cvs_proxy_port"] != ""
|
||||
self.split_proxy("http", params["http_proxy"])
|
||||
self.split_proxy("https", params["https_proxy"])
|
||||
self.split_proxy("ftp", params["ftp_proxy"])
|
||||
self.split_proxy("socks", params["socks_proxy"])
|
||||
self.split_proxy("git", params["git_proxy_host"] + ":" + params["git_proxy_port"])
|
||||
self.split_proxy("cvs", params["cvs_proxy_host"] + ":" + params["cvs_proxy_port"])
|
||||
|
||||
def load(self, template):
|
||||
self.curr_mach = template.getVar("MACHINE")
|
||||
self.curr_package_format = " ".join(template.getVar("PACKAGE_CLASSES").split("package_")).strip()
|
||||
self.curr_distro = template.getVar("DISTRO")
|
||||
self.dldir = template.getVar("DL_DIR")
|
||||
self.sstatedir = template.getVar("SSTATE_DIR")
|
||||
self.sstatemirror = template.getVar("SSTATE_MIRRORS")
|
||||
try:
|
||||
self.pmake = int(template.getVar("PARALLEL_MAKE").split()[1])
|
||||
except:
|
||||
pass
|
||||
try:
|
||||
self.bbthread = int(template.getVar("BB_NUMBER_THREADS"))
|
||||
except:
|
||||
pass
|
||||
try:
|
||||
self.image_rootfs_size = int(template.getVar("IMAGE_ROOTFS_SIZE"))
|
||||
except:
|
||||
@@ -202,9 +209,13 @@ class Configuration:
|
||||
# image_overhead_factor is read-only.
|
||||
self.incompat_license = template.getVar("INCOMPATIBLE_LICENSE")
|
||||
self.curr_sdk_machine = template.getVar("SDKMACHINE")
|
||||
self.conf_version = template.getVar("CONF_VERSION")
|
||||
self.lconf_version = template.getVar("LCONF_VERSION")
|
||||
self.extra_setting = eval(template.getVar("EXTRA_SETTING"))
|
||||
self.toolchain_build = eval(template.getVar("TOOLCHAIN_BUILD"))
|
||||
self.image_fstypes = template.getVar("IMAGE_FSTYPES")
|
||||
# bblayers.conf
|
||||
self.layers = template.getVar("BBLAYERS").split()
|
||||
# image/recipes/packages
|
||||
self.selected_image = template.getVar("__SELECTED_IMAGE__")
|
||||
self.selected_recipes = template.getVar("DEPENDS").split()
|
||||
@@ -215,35 +226,29 @@ class Configuration:
|
||||
self.split_proxy("http", template.getVar("http_proxy"))
|
||||
self.split_proxy("https", template.getVar("https_proxy"))
|
||||
self.split_proxy("ftp", template.getVar("ftp_proxy"))
|
||||
self.split_proxy("socks", template.getVar("all_proxy"))
|
||||
self.split_proxy("git", template.getVar("GIT_PROXY_HOST") + ":" + template.getVar("GIT_PROXY_PORT"))
|
||||
self.split_proxy("cvs", template.getVar("CVS_PROXY_HOST") + ":" + template.getVar("CVS_PROXY_PORT"))
|
||||
|
||||
def save(self, handler, template, defaults=False):
|
||||
def save(self, template, defaults=False):
|
||||
template.setVar("VERSION", "%s" % hobVer)
|
||||
# bblayers.conf
|
||||
handler.set_var_in_file("BBLAYERS", self.layers, "bblayers.conf")
|
||||
template.setVar("BBLAYERS", " ".join(self.layers))
|
||||
# local.conf
|
||||
if not defaults:
|
||||
handler.set_var_in_file("MACHINE", self.curr_mach, "local.conf")
|
||||
handler.set_var_in_file("DISTRO", self.curr_distro, "local.conf")
|
||||
handler.set_var_in_file("DL_DIR", self.dldir, "local.conf")
|
||||
handler.set_var_in_file("SSTATE_DIR", self.sstatedir, "local.conf")
|
||||
sstate_mirror_list = self.sstatemirror.split("\\n ")
|
||||
sstate_mirror_list_modified = []
|
||||
for mirror in sstate_mirror_list:
|
||||
if mirror != "":
|
||||
mirror = mirror + "\\n"
|
||||
sstate_mirror_list_modified.append(mirror)
|
||||
handler.set_var_in_file("SSTATE_MIRRORS", sstate_mirror_list_modified, "local.conf")
|
||||
handler.set_var_in_file("PARALLEL_MAKE", "-j %s" % self.pmake, "local.conf")
|
||||
handler.set_var_in_file("BB_NUMBER_THREADS", self.bbthread, "local.conf")
|
||||
handler.set_var_in_file("PACKAGE_CLASSES", " ".join(["package_" + i for i in self.curr_package_format.split()]), "local.conf")
|
||||
template.setVar("MACHINE", self.curr_mach)
|
||||
template.setVar("DISTRO", self.curr_distro)
|
||||
template.setVar("DL_DIR", self.dldir)
|
||||
template.setVar("SSTATE_DIR", self.sstatedir)
|
||||
template.setVar("SSTATE_MIRRORS", self.sstatemirror)
|
||||
template.setVar("PARALLEL_MAKE", "-j %s" % self.pmake)
|
||||
template.setVar("BB_NUMBER_THREADS", self.bbthread)
|
||||
template.setVar("PACKAGE_CLASSES", " ".join(["package_" + i for i in self.curr_package_format.split()]))
|
||||
template.setVar("IMAGE_ROOTFS_SIZE", self.image_rootfs_size)
|
||||
template.setVar("IMAGE_EXTRA_SPACE", self.image_extra_size)
|
||||
template.setVar("INCOMPATIBLE_LICENSE", self.incompat_license)
|
||||
template.setVar("SDKMACHINE", self.curr_sdk_machine)
|
||||
handler.set_var_in_file("CONF_VERSION", self.conf_version, "local.conf")
|
||||
handler.set_var_in_file("LCONF_VERSION", self.lconf_version, "bblayers.conf")
|
||||
template.setVar("CONF_VERSION", self.conf_version)
|
||||
template.setVar("LCONF_VERSION", self.lconf_version)
|
||||
template.setVar("EXTRA_SETTING", self.extra_setting)
|
||||
template.setVar("TOOLCHAIN_BUILD", self.toolchain_build)
|
||||
template.setVar("IMAGE_FSTYPES", self.image_fstypes)
|
||||
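Both sides of the save() hunk ultimately talk to a simple getVar/setVar template object. An illustrative dict-backed stand-in (not Hob's real TemplateMgr, which persists the values to a .hob file) shows the contract the Configuration code relies on:

class TemplateSketch:
    def __init__(self):
        self._vars = {}

    def setVar(self, name, value):
        self._vars[name] = value

    def getVar(self, name):
        return self._vars.get(name, "")

t = TemplateSketch()
t.setVar("MACHINE", "qemux86")
t.setVar("BBLAYERS", " ".join(["meta", "meta-yocto"]))
print(t.getVar("MACHINE"))   # qemux86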
@@ -258,7 +263,8 @@ class Configuration:
|
||||
template.setVar("http_proxy", self.combine_proxy("http"))
|
||||
template.setVar("https_proxy", self.combine_proxy("https"))
|
||||
template.setVar("ftp_proxy", self.combine_proxy("ftp"))
|
||||
template.setVar("all_proxy", self.combine_proxy("socks"))
|
||||
template.setVar("GIT_PROXY_HOST", self.combine_host_only("git"))
|
||||
template.setVar("GIT_PROXY_PORT", self.combine_port_only("git"))
|
||||
template.setVar("CVS_PROXY_HOST", self.combine_host_only("cvs"))
|
||||
template.setVar("CVS_PROXY_PORT", self.combine_port_only("cvs"))
|
||||
|
||||
@@ -273,8 +279,8 @@ class Configuration:
|
||||
(self.lconf_version, self.extra_setting, self.toolchain_build, self.image_fstypes, self.selected_image)
|
||||
s += "DEPENDS: '%s', IMAGE_INSTALL: '%s', enable_proxy: '%s', use_same_proxy: '%s', http_proxy: '%s', " % \
|
||||
(self.selected_recipes, self.user_selected_packages, self.enable_proxy, self.same_proxy, self.combine_proxy("http"))
|
||||
s += "https_proxy: '%s', ftp_proxy: '%s', all_proxy: '%s', CVS_PROXY_HOST: '%s', CVS_PROXY_PORT: '%s'" % \
|
||||
(self.combine_proxy("https"), self.combine_proxy("ftp"), self.combine_proxy("socks"),
|
||||
s += "https_proxy: '%s', ftp_proxy: '%s', GIT_PROXY_HOST: '%s', GIT_PROXY_PORT: '%s', CVS_PROXY_HOST: '%s', CVS_PROXY_PORT: '%s'" % \
|
||||
(self.combine_proxy("https"), self.combine_proxy("ftp"),self.combine_host_only("git"), self.combine_port_only("git"),
|
||||
self.combine_host_only("cvs"), self.combine_port_only("cvs"))
|
||||
return s
|
||||
|
||||
@@ -432,12 +438,6 @@ class Builder(gtk.Window):
|
||||
# Indicate whether the UI is working
|
||||
self.sensitive = True
|
||||
|
||||
# Indicate whether the sanity check ran
|
||||
self.sanity_checked = False
|
||||
|
||||
# save parsing warnings
|
||||
self.parsing_warnings = []
|
||||
|
||||
# create visual elements
|
||||
self.create_visual_elements()
|
||||
|
||||
@@ -455,20 +455,19 @@ class Builder(gtk.Window):
|
||||
self.handler.build.connect("build-failed", self.handler_build_failed_cb)
|
||||
self.handler.build.connect("build-aborted", self.handler_build_aborted_cb)
|
||||
self.handler.build.connect("task-started", self.handler_task_started_cb)
|
||||
self.handler.build.connect("disk-full", self.handler_disk_full_cb)
|
||||
self.handler.build.connect("log-error", self.handler_build_failure_cb)
|
||||
self.handler.build.connect("log-warning", self.handler_build_failure_cb)
|
||||
self.handler.build.connect("log", self.handler_build_log_cb)
|
||||
self.handler.build.connect("no-provider", self.handler_no_provider_cb)
|
||||
self.handler.connect("generating-data", self.handler_generating_data_cb)
|
||||
self.handler.connect("data-generated", self.handler_data_generated_cb)
|
||||
self.handler.connect("command-succeeded", self.handler_command_succeeded_cb)
|
||||
self.handler.connect("command-failed", self.handler_command_failed_cb)
|
||||
self.handler.connect("parsing-warning", self.handler_parsing_warning_cb)
|
||||
self.handler.connect("sanity-failed", self.handler_sanity_failed_cb)
|
||||
self.handler.connect("recipe-populated", self.handler_recipe_populated_cb)
|
||||
self.handler.connect("package-populated", self.handler_package_populated_cb)
|
||||
|
||||
self.handler.set_config_filter(hob_conf_filter)
|
||||
|
||||
self.initiate_new_build_async()
|
||||
|
||||
def create_visual_elements(self):
|
||||
@@ -541,16 +540,16 @@ class Builder(gtk.Window):
|
||||
sanity_check_post_func = func
|
||||
|
||||
def generate_configuration(self):
|
||||
if not self.sanity_checked:
|
||||
self.show_sanity_check_page()
|
||||
self.show_sanity_check_page()
|
||||
self.handler.generate_configuration()
|
||||
|
||||
def initiate_new_build_async(self):
|
||||
self.switch_page(self.MACHINE_SELECTION)
|
||||
self.handler.init_cooker()
|
||||
self.handler.set_extra_inherit("image_types")
|
||||
self.generate_configuration()
|
||||
self.load_template(TemplateMgr.convert_to_template_pathfilename("default", ".hob/"))
|
||||
if self.load_template(TemplateMgr.convert_to_template_pathfilename("default", ".hob/")) == False:
|
||||
self.show_sanity_check_page()
|
||||
self.handler.init_cooker()
|
||||
self.handler.set_extra_inherit("image_types")
|
||||
self.generate_configuration()
|
||||
|
||||
def update_config_async(self):
|
||||
self.switch_page(self.MACHINE_SELECTION)
|
||||
@@ -604,18 +603,15 @@ class Builder(gtk.Window):
|
||||
# Build image
|
||||
self.set_user_config()
|
||||
toolchain_packages = []
|
||||
base_image = None
|
||||
if self.configuration.toolchain_build:
|
||||
toolchain_packages = self.package_model.get_selected_packages_toolchain()
|
||||
if self.configuration.selected_image == self.recipe_model.__custom_image__:
|
||||
packages = self.package_model.get_selected_packages()
|
||||
image = self.hob_image
|
||||
base_image = self.configuration.initial_selected_image
|
||||
else:
|
||||
packages = []
|
||||
image = self.configuration.selected_image
|
||||
self.handler.generate_image(image,
|
||||
base_image,
|
||||
self.hob_toolchain,
|
||||
packages,
|
||||
toolchain_packages,
|
||||
@@ -659,7 +655,8 @@ class Builder(gtk.Window):
|
||||
if not os.path.exists(layer+'/conf/layer.conf'):
|
||||
return False
|
||||
|
||||
self.set_user_config_extra()
|
||||
self.save_defaults() # remember layers and settings
|
||||
self.update_config_async()
|
||||
return True
|
||||
|
||||
def save_template(self, path, defaults=False):
|
||||
@@ -673,7 +670,7 @@ class Builder(gtk.Window):
|
||||
self.template = TemplateMgr()
|
||||
try:
|
||||
self.template.open(filename, path)
|
||||
self.configuration.save(self.handler, self.template, defaults)
|
||||
self.configuration.save(self.template, defaults)
|
||||
|
||||
self.template.save()
|
||||
except Exception as e:
|
||||
@@ -749,31 +746,6 @@ class Builder(gtk.Window):
|
||||
self.previous_step = self.current_step
|
||||
self.current_step = next_step
|
||||
|
||||
def set_user_config_proxies(self):
|
||||
if self.configuration.enable_proxy == True:
|
||||
self.handler.set_http_proxy(self.configuration.combine_proxy("http"))
|
||||
self.handler.set_https_proxy(self.configuration.combine_proxy("https"))
|
||||
self.handler.set_ftp_proxy(self.configuration.combine_proxy("ftp"))
|
||||
self.handler.set_socks_proxy(self.configuration.combine_proxy("socks"))
|
||||
self.handler.set_cvs_proxy(self.configuration.combine_host_only("cvs"), self.configuration.combine_port_only("cvs"))
|
||||
elif self.configuration.enable_proxy == False:
|
||||
self.handler.set_http_proxy("")
|
||||
self.handler.set_https_proxy("")
|
||||
self.handler.set_ftp_proxy("")
|
||||
self.handler.set_socks_proxy("")
|
||||
self.handler.set_cvs_proxy("", "")
|
||||
|
||||
def set_user_config_extra(self):
|
||||
self.handler.set_rootfs_size(self.configuration.image_rootfs_size)
|
||||
self.handler.set_extra_size(self.configuration.image_extra_size)
|
||||
self.handler.set_incompatible_license(self.configuration.incompat_license)
|
||||
self.handler.set_sdk_machine(self.configuration.curr_sdk_machine)
|
||||
self.handler.set_image_fstypes(self.configuration.image_fstypes)
|
||||
self.handler.set_extra_config(self.configuration.extra_setting)
|
||||
self.handler.set_extra_inherit("packageinfo")
|
||||
self.handler.set_extra_inherit("image_types")
|
||||
self.set_user_config_proxies()
|
||||
|
||||
def set_user_config(self):
|
||||
self.handler.init_cooker()
|
||||
# set bb layers
|
||||
@@ -787,7 +759,27 @@ class Builder(gtk.Window):
|
||||
self.handler.set_sstate_mirrors(self.configuration.sstatemirror)
|
||||
self.handler.set_pmake(self.configuration.pmake)
|
||||
self.handler.set_bbthreads(self.configuration.bbthread)
|
||||
self.set_user_config_extra()
|
||||
self.handler.set_rootfs_size(self.configuration.image_rootfs_size)
|
||||
self.handler.set_extra_size(self.configuration.image_extra_size)
|
||||
self.handler.set_incompatible_license(self.configuration.incompat_license)
|
||||
self.handler.set_sdk_machine(self.configuration.curr_sdk_machine)
|
||||
self.handler.set_image_fstypes(self.configuration.image_fstypes)
|
||||
self.handler.set_extra_config(self.configuration.extra_setting)
|
||||
self.handler.set_extra_inherit("packageinfo")
|
||||
self.handler.set_extra_inherit("image_types")
|
||||
# set proxies
|
||||
if self.configuration.enable_proxy == True:
|
||||
self.handler.set_http_proxy(self.configuration.combine_proxy("http"))
|
||||
self.handler.set_https_proxy(self.configuration.combine_proxy("https"))
|
||||
self.handler.set_ftp_proxy(self.configuration.combine_proxy("ftp"))
|
||||
self.handler.set_git_proxy(self.configuration.combine_host_only("git"), self.configuration.combine_port_only("git"))
|
||||
self.handler.set_cvs_proxy(self.configuration.combine_host_only("cvs"), self.configuration.combine_port_only("cvs"))
|
||||
elif self.configuration.enable_proxy == False:
|
||||
self.handler.set_http_proxy("")
|
||||
self.handler.set_https_proxy("")
|
||||
self.handler.set_ftp_proxy("")
|
||||
self.handler.set_git_proxy("", "")
|
||||
self.handler.set_cvs_proxy("", "")
|
||||
|
||||
def update_recipe_model(self, selected_image, selected_recipes):
|
||||
self.recipe_model.set_selected_image(selected_image)
|
||||
@@ -838,9 +830,7 @@ class Builder(gtk.Window):
|
||||
if not self.configuration.curr_mach:
|
||||
self.configuration.curr_mach = self.handler.runCommand(["getVariable", "HOB_MACHINE"]) or ""
|
||||
self.update_configuration_parameters(self.get_parameters_sync())
|
||||
if not self.sanity_checked:
|
||||
self.sanity_check()
|
||||
self.sanity_checked = True
|
||||
self.sanity_check()
|
||||
elif initcmd == self.handler.SANITY_CHECK:
|
||||
if self.had_network_error:
|
||||
self.had_network_error = False
|
||||
@@ -872,15 +862,6 @@ class Builder(gtk.Window):
|
||||
response = dialog.run()
|
||||
dialog.destroy()
|
||||
|
||||
def show_warning_dialog(self):
|
||||
dialog = ParsingWarningsDialog(title = "View warnings",
|
||||
warnings = self.parsing_warnings,
|
||||
parent = None,
|
||||
flags = gtk.DIALOG_DESTROY_WITH_PARENT
|
||||
| gtk.DIALOG_NO_SEPARATOR)
|
||||
response = dialog.run()
|
||||
dialog.destroy()
|
||||
|
||||
def show_network_error_dialog(self):
|
||||
lbl = "<b>Hob cannot connect to the network</b>\n"
|
||||
msg = "Please check your network connection. If you are using a proxy server, please make sure it is configured correctly."
|
||||
@@ -904,9 +885,6 @@ class Builder(gtk.Window):
|
||||
self.show_error_dialog(msg)
|
||||
self.reset()
|
||||
|
||||
def handler_parsing_warning_cb(self, handler, warn_msg):
|
||||
self.parsing_warnings.append(warn_msg)
|
||||
|
||||
def handler_sanity_failed_cb(self, handler, msg, network_error):
|
||||
self.reset()
|
||||
if network_error:
|
||||
@@ -966,10 +944,10 @@ class Builder(gtk.Window):
|
||||
self.package_details_page.refresh_selection()
|
||||
|
||||
def handler_recipe_populated_cb(self, handler):
|
||||
self.image_configuration_page.update_progress_bar("Populating recipes", 0.99)
|
||||
self.image_configuration_page.update_progress_bar("Populated recipes", 0.99)
|
||||
|
||||
def handler_package_populated_cb(self, handler):
|
||||
self.image_configuration_page.update_progress_bar("Populating packages", 1.0)
|
||||
self.image_configuration_page.update_progress_bar("Populated packages", 1.0)
|
||||
|
||||
def handler_parsing_started_cb(self, handler, message):
|
||||
if self.current_step != self.RCPPKGINFO_POPULATING:
|
||||
@@ -979,10 +957,10 @@ class Builder(gtk.Window):
|
||||
if message["eventname"] == "TreeDataPreparationStarted":
|
||||
fraction = 0.6 + fraction
|
||||
self.image_configuration_page.stop_button.set_sensitive(False)
|
||||
self.image_configuration_page.update_progress_bar("Generating dependency tree", fraction)
|
||||
else:
|
||||
self.image_configuration_page.stop_button.set_sensitive(True)
|
||||
self.image_configuration_page.update_progress_bar(message["title"], fraction)
|
||||
|
||||
self.image_configuration_page.update_progress_bar(message["title"], fraction)
|
||||
|
||||
def handler_parsing_cb(self, handler, message):
|
||||
if self.current_step != self.RCPPKGINFO_POPULATING:
|
||||
@@ -991,10 +969,9 @@ class Builder(gtk.Window):
|
||||
fraction = message["current"] * 1.0/message["total"]
|
||||
if message["eventname"] == "TreeDataPreparationProgress":
|
||||
fraction = 0.6 + 0.38 * fraction
|
||||
self.image_configuration_page.update_progress_bar("Generating dependency tree", fraction)
|
||||
else:
|
||||
fraction = 0.6 * fraction
|
||||
self.image_configuration_page.update_progress_bar(message["title"], fraction)
|
||||
self.image_configuration_page.update_progress_bar(message["title"], fraction)
|
||||
|
||||
def handler_parsing_completed_cb(self, handler, message):
|
||||
if self.current_step != self.RCPPKGINFO_POPULATING:
|
||||
@@ -1004,7 +981,7 @@ class Builder(gtk.Window):
|
||||
fraction = 0.98
|
||||
else:
|
||||
fraction = 0.6
|
||||
self.image_configuration_page.update_progress_bar("Generating dependency tree", fraction)
|
||||
self.image_configuration_page.update_progress_bar(message["title"], fraction)
|
||||
|
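The constants in these parsing callbacks partition the progress bar: tree-data preparation is mapped into the upper band, ordinary parsing into the lower one. A compact sketch of the mapping:

def parsing_fraction(eventname, current, total):
    # TreeDataPreparationProgress is scaled into the 0.6..0.98 band of the
    # bar, any other parse event into 0.0..0.6, as in the hunks above.
    fraction = current * 1.0 / total
    if eventname == "TreeDataPreparationProgress":
        return 0.6 + 0.38 * fraction
    return 0.6 * fraction

print(parsing_fraction("TreeDataPreparationProgress", 50, 100))   # ~0.79
print(parsing_fraction("ParsingProgress", 50, 100))               # ~0.3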
||||
def handler_build_started_cb(self, running_build):
|
||||
if self.current_step == self.FAST_IMAGE_GENERATING:
|
||||
@@ -1024,13 +1001,10 @@ class Builder(gtk.Window):
|
||||
fraction = 0.9
|
||||
elif self.current_step == self.IMAGE_GENERATING:
|
||||
fraction = 1.0
|
||||
version = ""
|
||||
self.parameters.image_names = []
|
||||
selected_image = self.recipe_model.get_selected_image()
|
||||
if selected_image == self.recipe_model.__custom_image__:
|
||||
if self.configuration.initial_selected_image != selected_image:
|
||||
version = self.recipe_model.get_custom_image_version()
|
||||
linkname = 'hob-image' + version+ "-" + self.configuration.curr_mach
|
||||
linkname = 'hob-image-' + self.configuration.curr_mach
|
||||
else:
|
||||
linkname = selected_image + '-' + self.configuration.curr_mach
|
||||
image_extension = self.get_image_extension()
|
||||
@@ -1143,9 +1117,6 @@ class Builder(gtk.Window):
|
||||
self.build_details_page.update_progress_bar(title + ": ", fraction)
|
||||
self.build_details_page.update_build_status(message["current"], message["total"], message["task"])
|
||||
|
||||
def handler_disk_full_cb(self, running_build):
|
||||
self.disk_full = True
|
||||
|
||||
def handler_build_failure_cb(self, running_build):
|
||||
self.build_details_page.show_issues()
|
||||
|
||||
@@ -1205,42 +1176,15 @@ class Builder(gtk.Window):
|
||||
|
||||
self.fast_generate_image_async(True)
|
||||
|
||||
def show_recipe_property_dialog(self, properties):
|
||||
information = {}
|
||||
dialog = PropertyDialog(title = properties["name"] +' '+ "properties",
|
||||
parent = self,
|
||||
information = properties,
|
||||
flags = gtk.DIALOG_DESTROY_WITH_PARENT
|
||||
| gtk.DIALOG_NO_SEPARATOR)
|
||||
def show_binb_dialog(self, binb):
|
||||
markup = "<b>Brought in by:</b>\n%s" % binb
|
||||
ptip = PersistentTooltip(markup, self)
|
||||
|
||||
dialog.set_modal(False)
|
||||
|
||||
button = dialog.add_button("Close", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button.connect("clicked", lambda w: dialog.destroy())
|
||||
|
||||
dialog.run()
|
||||
|
||||
def show_packages_property_dialog(self, properties):
|
||||
information = {}
|
||||
dialog = PropertyDialog(title = properties["name"] +' '+ "properties",
|
||||
parent = self,
|
||||
information = properties,
|
||||
flags = gtk.DIALOG_DESTROY_WITH_PARENT
|
||||
| gtk.DIALOG_NO_SEPARATOR)
|
||||
|
||||
dialog.set_modal(False)
|
||||
|
||||
button = dialog.add_button("Close", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button.connect("clicked", lambda w: dialog.destroy())
|
||||
|
||||
dialog.run()
|
||||
ptip.show()
|
||||
|
||||
def show_layer_selection_dialog(self):
|
||||
dialog = LayerSelectionDialog(title = "Layers",
|
||||
layers = copy.deepcopy(self.configuration.layers),
|
||||
layers_non_removable = copy.deepcopy(self.configuration.layers_non_removable),
|
||||
all_layers = self.parameters.all_layers,
|
||||
parent = self,
|
||||
flags = gtk.DIALOG_MODAL
|
||||
@@ -1259,6 +1203,39 @@ class Builder(gtk.Window):
|
||||
self.update_config_async()
|
||||
dialog.destroy()
|
||||
|
||||
def show_load_template_dialog(self):
|
||||
dialog = gtk.FileChooserDialog("Load Template Files", self,
|
||||
gtk.FILE_CHOOSER_ACTION_OPEN)
|
||||
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button = dialog.add_button("Open", gtk.RESPONSE_YES)
|
||||
HobButton.style_button(button)
|
||||
filter = gtk.FileFilter()
|
||||
filter.set_name("Hob Files")
|
||||
filter.add_pattern("*.hob")
|
||||
dialog.add_filter(filter)
|
||||
|
||||
response = dialog.run()
|
||||
path = None
|
||||
if response == gtk.RESPONSE_YES:
|
||||
path = dialog.get_filename()
|
||||
dialog.destroy()
|
||||
return response == gtk.RESPONSE_YES, path
|
||||
|
||||
def show_save_template_dialog(self):
|
||||
dialog = gtk.FileChooserDialog("Save Template Files", self,
|
||||
gtk.FILE_CHOOSER_ACTION_SAVE)
|
||||
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button = dialog.add_button("Save", gtk.RESPONSE_YES)
|
||||
HobButton.style_button(button)
|
||||
dialog.set_current_name("hob")
|
||||
response = dialog.run()
|
||||
if response == gtk.RESPONSE_YES:
|
||||
path = dialog.get_filename()
|
||||
self.save_template(path)
|
||||
dialog.destroy()
|
||||
|
||||
def get_image_extension(self):
|
||||
image_extension = {}
|
||||
for type in self.parameters.image_types:
|
||||
@@ -1297,7 +1274,7 @@ class Builder(gtk.Window):
|
||||
dialog.destroy()
|
||||
|
||||
def show_adv_settings_dialog(self, tab=None):
|
||||
dialog = AdvancedSettingsDialog(title = "Advanced configuration",
|
||||
dialog = AdvancedSettingDialog(title = "Advanced configuration",
|
||||
configuration = copy.deepcopy(self.configuration),
|
||||
all_image_types = self.parameters.image_types,
|
||||
all_package_formats = self.parameters.all_package_formats,
|
||||
@@ -1333,8 +1310,7 @@ class Builder(gtk.Window):
|
||||
parent = self,
|
||||
flags = gtk.DIALOG_MODAL
|
||||
| gtk.DIALOG_DESTROY_WITH_PARENT
|
||||
| gtk.DIALOG_NO_SEPARATOR,
|
||||
handler = self.handler)
|
||||
| gtk.DIALOG_NO_SEPARATOR)
|
||||
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button = dialog.add_button("Save", gtk.RESPONSE_YES)
|
||||
@@ -1347,14 +1323,6 @@ class Builder(gtk.Window):
|
||||
self.configuration = dialog.configuration
|
||||
self.save_defaults() # remember settings
|
||||
settings_changed = dialog.settings_changed
|
||||
if dialog.proxy_settings_changed:
|
||||
self.set_user_config_proxies()
|
||||
elif dialog.proxy_test_ran:
|
||||
# The user might have modified the proxies in the "Proxy"
|
||||
# tab, which in turn made the proxy settings modify in bb.
|
||||
# If "Cancel" was pressed, restore the previous proxy
|
||||
# settings inside bb.
|
||||
self.set_user_config_proxies()
|
||||
dialog.destroy()
|
||||
return response == gtk.RESPONSE_YES, settings_changed
|
||||
|
||||
@@ -1512,7 +1480,7 @@ class Builder(gtk.Window):
|
||||
if response != gtk.RESPONSE_CANCEL:
|
||||
self.stopping = True
|
||||
if response == gtk.RESPONSE_OK:
|
||||
self.build_details_page.progress_bar.set_stop_title("Stopping the build....")
|
||||
self.build_details_page.progress_bar.set_title("Stopping the build...")
|
||||
self.build_details_page.progress_bar.set_rcstyle("stop")
|
||||
self.cancel_build_sync()
|
||||
elif response == gtk.RESPONSE_YES:
|
||||
|
||||
bitbake/lib/bb/ui/crumbs/hig.py (1753 lines): file diff suppressed because it is too large
@@ -1,336 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
import hashlib
|
||||
from bb.ui.crumbs.hobwidget import HobInfoButton, HobButton
|
||||
from bb.ui.crumbs.progressbar import HobProgressBar
|
||||
from bb.ui.crumbs.hig.settingsuihelper import SettingsUIHelper
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
from bb.ui.crumbs.hig.proxydetailsdialog import ProxyDetailsDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class AdvancedSettingsDialog (CrumbsDialog, SettingsUIHelper):
|
||||
|
||||
def details_cb(self, button, parent, protocol):
|
||||
dialog = ProxyDetailsDialog(title = protocol.upper() + " Proxy Details",
|
||||
user = self.configuration.proxies[protocol][1],
|
||||
passwd = self.configuration.proxies[protocol][2],
|
||||
parent = parent,
|
||||
flags = gtk.DIALOG_MODAL
|
||||
| gtk.DIALOG_DESTROY_WITH_PARENT
|
||||
| gtk.DIALOG_NO_SEPARATOR)
|
||||
dialog.add_button(gtk.STOCK_CLOSE, gtk.RESPONSE_OK)
|
||||
response = dialog.run()
|
||||
if response == gtk.RESPONSE_OK:
|
||||
self.configuration.proxies[protocol][1] = dialog.user
|
||||
self.configuration.proxies[protocol][2] = dialog.passwd
|
||||
self.refresh_proxy_components()
|
||||
dialog.destroy()
|
||||
|
||||
def set_save_button(self, button):
|
||||
self.save_button = button
|
||||
|
||||
def rootfs_combo_changed_cb(self, rootfs_combo, all_package_format, check_hbox):
|
||||
combo_item = self.rootfs_combo.get_active_text()
|
||||
modified = False
|
||||
for child in check_hbox.get_children():
|
||||
if isinstance(child, gtk.CheckButton):
|
||||
check_hbox.remove(child)
|
||||
modified = True
|
||||
for format in all_package_format:
|
||||
if format != combo_item:
|
||||
check_button = gtk.CheckButton(format)
|
||||
check_hbox.pack_start(check_button, expand=False, fill=False)
|
||||
modified = True
|
||||
if modified:
|
||||
check_hbox.remove(self.pkgfmt_info)
|
||||
check_hbox.pack_start(self.pkgfmt_info, expand=False, fill=False)
|
||||
check_hbox.show_all()
|
||||
|
||||
def gen_pkgfmt_widget(self, curr_package_format, all_package_format, tooltip_combo="", tooltip_extra=""):
|
||||
pkgfmt_vbox = gtk.VBox(False, 6)
|
||||
|
||||
label = self.gen_label_widget("Root file system package format")
|
||||
pkgfmt_vbox.pack_start(label, expand=False, fill=False)
|
||||
|
||||
rootfs_format = ""
|
||||
if curr_package_format:
|
||||
rootfs_format = curr_package_format.split()[0]
|
||||
|
||||
rootfs_format_widget, rootfs_combo = self.gen_combo_widget(rootfs_format, all_package_format, tooltip_combo)
|
||||
pkgfmt_vbox.pack_start(rootfs_format_widget, expand=False, fill=False)
|
||||
|
||||
label = self.gen_label_widget("Additional package formats")
|
||||
pkgfmt_vbox.pack_start(label, expand=False, fill=False)
|
||||
|
||||
check_hbox = gtk.HBox(False, 12)
|
||||
pkgfmt_vbox.pack_start(check_hbox, expand=False, fill=False)
|
||||
for format in all_package_format:
|
||||
if format != rootfs_format:
|
||||
check_button = gtk.CheckButton(format)
|
||||
is_active = (format in curr_package_format.split())
|
||||
check_button.set_active(is_active)
|
||||
check_hbox.pack_start(check_button, expand=False, fill=False)
|
||||
|
||||
self.pkgfmt_info = HobInfoButton(tooltip_extra, self)
|
||||
check_hbox.pack_start(self.pkgfmt_info, expand=False, fill=False)
|
||||
|
||||
rootfs_combo.connect("changed", self.rootfs_combo_changed_cb, all_package_format, check_hbox)
|
||||
|
||||
pkgfmt_vbox.show_all()
|
||||
|
||||
return pkgfmt_vbox, rootfs_combo, check_hbox
|
||||
|
||||
def __init__(self, title, configuration, all_image_types,
|
||||
all_package_formats, all_distros, all_sdk_machines,
|
||||
max_threads, parent, flags, buttons=None):
|
||||
super(AdvancedSettingsDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
# class members from other objects
|
||||
# bitbake settings from Builder.Configuration
|
||||
self.configuration = configuration
|
||||
self.image_types = all_image_types
|
||||
self.all_package_formats = all_package_formats
|
||||
self.all_distros = all_distros[:]
|
||||
self.all_sdk_machines = all_sdk_machines
|
||||
self.max_threads = max_threads
|
||||
|
||||
# class members for internal use
|
||||
self.distro_combo = None
|
||||
self.dldir_text = None
|
||||
self.sstatedir_text = None
|
||||
self.sstatemirror_text = None
|
||||
self.bb_spinner = None
|
||||
self.pmake_spinner = None
|
||||
self.rootfs_size_spinner = None
|
||||
self.extra_size_spinner = None
|
||||
self.gplv3_checkbox = None
|
||||
self.toolchain_checkbox = None
|
||||
self.image_types_checkbuttons = {}
|
||||
|
||||
self.md5 = self.config_md5()
|
||||
self.settings_changed = False
|
||||
|
||||
# create visual elements on the dialog
|
||||
self.save_button = None
|
||||
self.create_visual_elements()
|
||||
self.connect("response", self.response_cb)
|
||||
|
||||
def _get_sorted_value(self, var):
|
||||
return " ".join(sorted(str(var).split())) + "\n"
|
||||
|
||||
def config_md5(self):
|
||||
data = ""
|
||||
data += ("PACKAGE_CLASSES: " + self.configuration.curr_package_format + '\n')
|
||||
data += ("DISTRO: " + self._get_sorted_value(self.configuration.curr_distro))
|
||||
data += ("IMAGE_ROOTFS_SIZE: " + self._get_sorted_value(self.configuration.image_rootfs_size))
|
||||
data += ("IMAGE_EXTRA_SIZE: " + self._get_sorted_value(self.configuration.image_extra_size))
|
||||
data += ("INCOMPATIBLE_LICENSE: " + self._get_sorted_value(self.configuration.incompat_license))
|
||||
data += ("SDK_MACHINE: " + self._get_sorted_value(self.configuration.curr_sdk_machine))
|
||||
data += ("TOOLCHAIN_BUILD: " + self._get_sorted_value(self.configuration.toolchain_build))
|
||||
data += ("IMAGE_FSTYPES: " + self._get_sorted_value(self.configuration.image_fstypes))
|
||||
return hashlib.md5(data).hexdigest()
|
||||
|
||||
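config_md5() above detects whether the dialog changed anything by hashing a canonical, sorted dump of the relevant settings and comparing the digest when the dialog closes. A minimal, self-contained sketch of that idea (the function name and the sample keys below are illustrative, not Hob's):

import hashlib

def settings_fingerprint(settings):
    # Sort the keys and each value's tokens so that ordering differences
    # do not register as a change; only real value changes alter the digest.
    lines = []
    for key in sorted(settings):
        value = " ".join(sorted(str(settings[key]).split()))
        lines.append("%s: %s" % (key, value))
    return hashlib.md5("\n".join(lines).encode("utf-8")).hexdigest()

before = settings_fingerprint({"IMAGE_FSTYPES": "ext3 tar.bz2", "PACKAGE_CLASSES": "package_rpm"})
after = settings_fingerprint({"IMAGE_FSTYPES": "tar.bz2 ext3", "PACKAGE_CLASSES": "package_rpm"})
assert before == after
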
def create_visual_elements(self):
|
||||
self.nb = gtk.Notebook()
|
||||
self.nb.set_show_tabs(True)
|
||||
self.nb.append_page(self.create_image_types_page(), gtk.Label("Image types"))
|
||||
self.nb.append_page(self.create_output_page(), gtk.Label("Output"))
|
||||
self.nb.set_current_page(0)
|
||||
self.vbox.pack_start(self.nb, expand=True, fill=True)
|
||||
self.vbox.pack_end(gtk.HSeparator(), expand=True, fill=True)
|
||||
|
||||
self.show_all()
|
||||
|
||||
def get_num_checked_image_types(self):
|
||||
total = 0
|
||||
for b in self.image_types_checkbuttons.values():
|
||||
if b.get_active():
|
||||
total = total + 1
|
||||
return total
|
||||
|
||||
def set_save_button_state(self):
|
||||
if self.save_button:
|
||||
self.save_button.set_sensitive(self.get_num_checked_image_types() > 0)
|
||||
|
||||
def image_type_checkbutton_clicked_cb(self, button):
|
||||
self.set_save_button_state()
|
||||
if self.get_num_checked_image_types() == 0:
|
||||
# Show an error dialog
|
||||
lbl = "<b>Select an image type</b>\n\nYou need to select at least one image type."
|
||||
dialog = CrumbsMessageDialog(self, lbl, gtk.STOCK_DIALOG_WARNING)
|
||||
button = dialog.add_button("OK", gtk.RESPONSE_OK)
|
||||
HobButton.style_button(button)
|
||||
response = dialog.run()
|
||||
dialog.destroy()
|
||||
|
||||
def create_image_types_page(self):
|
||||
main_vbox = gtk.VBox(False, 16)
|
||||
main_vbox.set_border_width(6)
|
||||
|
||||
advanced_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.set_border_width(6)
|
||||
|
||||
distro_vbox = gtk.VBox(False, 6)
|
||||
label = self.gen_label_widget("Distro:")
|
||||
tooltip = "Selects the Yocto Project distribution you want"
|
||||
try:
|
||||
i = self.all_distros.index( "defaultsetup" )
|
||||
except ValueError:
|
||||
i = -1
|
||||
if i != -1:
|
||||
self.all_distros[ i ] = "Default"
|
||||
if self.configuration.curr_distro == "defaultsetup":
|
||||
self.configuration.curr_distro = "Default"
|
||||
distro_widget, self.distro_combo = self.gen_combo_widget(self.configuration.curr_distro, self.all_distros,"<b>Distro</b>" + "*" + tooltip)
|
||||
distro_vbox.pack_start(label, expand=False, fill=False)
|
||||
distro_vbox.pack_start(distro_widget, expand=False, fill=False)
|
||||
main_vbox.pack_start(distro_vbox, expand=False, fill=False)
|
||||
|
||||
|
||||
rows = (len(self.image_types)+1)/3
|
||||
table = gtk.Table(rows + 1, 10, True)
|
||||
advanced_vbox.pack_start(table, expand=False, fill=False)
|
||||
|
||||
tooltip = "Image file system types you want."
|
||||
info = HobInfoButton("<b>Image types</b>" + "*" + tooltip, self)
|
||||
label = self.gen_label_widget("Image types:")
|
||||
align = gtk.Alignment(0, 0.5, 0, 0)
|
||||
table.attach(align, 0, 4, 0, 1)
|
||||
align.add(label)
|
||||
table.attach(info, 4, 5, 0, 1)
|
||||
|
||||
i = 1
|
||||
j = 1
|
||||
for image_type in sorted(self.image_types):
|
||||
self.image_types_checkbuttons[image_type] = gtk.CheckButton(image_type)
|
||||
self.image_types_checkbuttons[image_type].connect("toggled", self.image_type_checkbutton_clicked_cb)
|
||||
article = ""
|
||||
if image_type.startswith(("a", "e", "i", "o", "u")):
|
||||
article = "n"
|
||||
self.image_types_checkbuttons[image_type].set_tooltip_text("Build a%s %s image" % (article, image_type))
|
||||
table.attach(self.image_types_checkbuttons[image_type], j - 1, j + 3, i, i + 1)
|
||||
if image_type in self.configuration.image_fstypes.split():
|
||||
self.image_types_checkbuttons[image_type].set_active(True)
|
||||
i += 1
|
||||
if i > rows:
|
||||
i = 1
|
||||
j = j + 4
|
||||
|
||||
main_vbox.pack_start(advanced_vbox, expand=False, fill=False)
|
||||
self.set_save_button_state()
|
||||
|
||||
return main_vbox
|
||||
|
||||
def create_output_page(self):
|
||||
advanced_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.set_border_width(6)
|
||||
|
||||
advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Package format</span>'), expand=False, fill=False)
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
tooltip_combo = "Selects the package format used to generate rootfs."
|
||||
tooltip_extra = "Selects extra package formats to build"
|
||||
pkgfmt_widget, self.rootfs_combo, self.check_hbox = self.gen_pkgfmt_widget(self.configuration.curr_package_format, self.all_package_formats,"<b>Root file system package format</b>" + "*" + tooltip_combo,"<b>Additional package formats</b>" + "*" + tooltip_extra)
|
||||
sub_vbox.pack_start(pkgfmt_widget, expand=False, fill=False)
|
||||
|
||||
advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Image size</span>'), expand=False, fill=False)
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
label = self.gen_label_widget("Image basic size (in MB)")
|
||||
tooltip = "Sets the basic size of your target image.\nThis is the basic size of your target image unless your selected package size exceeds this value or you select \'Image Extra Size\'."
|
||||
rootfs_size_widget, self.rootfs_size_spinner = self.gen_spinner_widget(int(self.configuration.image_rootfs_size*1.0/1024), 0, 65536,"<b>Image basic size</b>" + "*" + tooltip)
|
||||
sub_vbox.pack_start(label, expand=False, fill=False)
|
||||
sub_vbox.pack_start(rootfs_size_widget, expand=False, fill=False)
|
||||
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
label = self.gen_label_widget("Additional free space (in MB)")
|
||||
tooltip = "Sets the extra free space of your target image.\nBy default, the system reserves 30% of your image size as free space. If your image contains zypper, it brings in 50MB more space. The maximum free space is 64GB."
|
||||
extra_size_widget, self.extra_size_spinner = self.gen_spinner_widget(int(self.configuration.image_extra_size*1.0/1024), 0, 65536,"<b>Additional free space</b>" + "*" + tooltip)
|
||||
sub_vbox.pack_start(label, expand=False, fill=False)
|
||||
sub_vbox.pack_start(extra_size_widget, expand=False, fill=False)
|
||||
|
||||
advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Licensing</span>'), expand=False, fill=False)
|
||||
self.gplv3_checkbox = gtk.CheckButton("Exclude GPLv3 packages")
|
||||
self.gplv3_checkbox.set_tooltip_text("Check this box to prevent GPLv3 packages from being included in your image")
|
||||
if "GPLv3" in self.configuration.incompat_license.split():
|
||||
self.gplv3_checkbox.set_active(True)
|
||||
else:
|
||||
self.gplv3_checkbox.set_active(False)
|
||||
advanced_vbox.pack_start(self.gplv3_checkbox, expand=False, fill=False)
|
||||
|
||||
advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Toolchain</span>'), expand=False, fill=False)
|
||||
sub_hbox = gtk.HBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_hbox, expand=False, fill=False)
|
||||
self.toolchain_checkbox = gtk.CheckButton("Build toolchain")
|
||||
self.toolchain_checkbox.set_tooltip_text("Check this box to build the related toolchain with your image")
|
||||
self.toolchain_checkbox.set_active(self.configuration.toolchain_build)
|
||||
sub_hbox.pack_start(self.toolchain_checkbox, expand=False, fill=False)
|
||||
|
||||
tooltip = "Selects the host platform for which you want to run the toolchain"
|
||||
sdk_machine_widget, self.sdk_machine_combo = self.gen_combo_widget(self.configuration.curr_sdk_machine, self.all_sdk_machines,"<b>Build toolchain</b>" + "*" + tooltip)
|
||||
sub_hbox.pack_start(sdk_machine_widget, expand=False, fill=False)
|
||||
|
||||
return advanced_vbox
|
||||
|
||||
def response_cb(self, dialog, response_id):
|
||||
package_format = []
|
||||
package_format.append(self.rootfs_combo.get_active_text())
|
||||
for child in self.check_hbox:
|
||||
if isinstance(child, gtk.CheckButton) and child.get_active():
|
||||
package_format.append(child.get_label())
|
||||
self.configuration.curr_package_format = " ".join(package_format)
|
||||
|
||||
distro = self.distro_combo.get_active_text()
|
||||
if distro == "Default":
|
||||
distro = "defaultsetup"
|
||||
self.configuration.curr_distro = distro
|
||||
self.configuration.image_rootfs_size = self.rootfs_size_spinner.get_value_as_int() * 1024
|
||||
self.configuration.image_extra_size = self.extra_size_spinner.get_value_as_int() * 1024
|
||||
|
||||
self.configuration.image_fstypes = ""
|
||||
for image_type in self.image_types:
|
||||
if self.image_types_checkbuttons[image_type].get_active():
|
||||
self.configuration.image_fstypes += (" " + image_type)
|
||||
self.configuration.image_fstypes = self.configuration.image_fstypes.strip()
|
||||
|
||||
if self.gplv3_checkbox.get_active():
|
||||
if "GPLv3" not in self.configuration.incompat_license.split():
|
||||
self.configuration.incompat_license += " GPLv3"
|
||||
else:
|
||||
if "GPLv3" in self.configuration.incompat_license.split():
|
||||
self.configuration.incompat_license = self.configuration.incompat_license.split().remove("GPLv3")
|
||||
self.configuration.incompat_license = " ".join(self.configuration.incompat_license or [])
|
||||
self.configuration.incompat_license = self.configuration.incompat_license.strip()
|
||||
|
||||
self.configuration.toolchain_build = self.toolchain_checkbox.get_active()
|
||||
self.configuration.curr_sdk_machine = self.sdk_machine_combo.get_active_text()
|
||||
md5 = self.config_md5()
|
||||
self.settings_changed = (self.md5 != md5)
|
||||
@@ -1,44 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class CrumbsDialog(gtk.Dialog):
|
||||
"""
|
||||
A GNOME HIG compliant dialog widget.
|
||||
Add buttons with gtk.Dialog.add_button or gtk.Dialog.add_buttons
|
||||
"""
|
||||
def __init__(self, title="", parent=None, flags=0, buttons=None):
|
||||
super(CrumbsDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
self.set_property("has-separator", False) # note: deprecated in 2.22
|
||||
|
||||
self.set_border_width(6)
|
||||
self.vbox.set_property("spacing", 12)
|
||||
self.action_area.set_property("spacing", 12)
|
||||
self.action_area.set_property("border-width", 6)
|
||||
@@ -1,95 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import glib
|
||||
import gtk
|
||||
from bb.ui.crumbs.hobwidget import HobIconChecker
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class CrumbsMessageDialog(CrumbsDialog):
|
||||
"""
|
||||
A GNOME HIG compliant dialog widget.
|
||||
Add buttons with gtk.Dialog.add_button or gtk.Dialog.add_buttons
|
||||
"""
|
||||
def __init__(self, parent=None, label="", icon=gtk.STOCK_INFO, msg=""):
|
||||
super(CrumbsMessageDialog, self).__init__("", parent, gtk.DIALOG_MODAL)
|
||||
|
||||
self.set_border_width(6)
|
||||
self.vbox.set_property("spacing", 12)
|
||||
self.action_area.set_property("spacing", 12)
|
||||
self.action_area.set_property("border-width", 6)
|
||||
|
||||
first_column = gtk.HBox(spacing=12)
|
||||
first_column.set_property("border-width", 6)
|
||||
first_column.show()
|
||||
self.vbox.add(first_column)
|
||||
|
||||
self.icon = gtk.Image()
|
||||
# We have our own Info icon which should be used in preference to the stock icon
|
||||
self.icon_chk = HobIconChecker()
|
||||
self.icon.set_from_stock(self.icon_chk.check_stock_icon(icon), gtk.ICON_SIZE_DIALOG)
|
||||
self.icon.set_property("yalign", 0.00)
|
||||
self.icon.show()
|
||||
first_column.pack_start(self.icon, expand=False, fill=True, padding=0)
|
||||
|
||||
if 0 <= len(msg) < 200:
|
||||
lbl = label + "%s" % glib.markup_escape_text(msg)
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_use_markup(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup(lbl)
|
||||
self.label_short.set_property("yalign", 0.00)
|
||||
self.label_short.show()
|
||||
first_column.add(self.label_short)
|
||||
else:
|
||||
second_row = gtk.VBox(spacing=12)
|
||||
second_row.set_property("border-width", 6)
|
||||
self.label_long = gtk.Label()
|
||||
self.label_long.set_use_markup(True)
|
||||
self.label_long.set_line_wrap(True)
|
||||
self.label_long.set_markup(label)
|
||||
self.label_long.set_alignment(0.0, 0.0)
|
||||
second_row.pack_start(self.label_long, expand=False, fill=False, padding=0)
|
||||
self.label_long.show()
|
||||
self.textWindow = gtk.ScrolledWindow()
|
||||
self.textWindow.set_shadow_type(gtk.SHADOW_IN)
|
||||
self.textWindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
|
||||
self.msgView = gtk.TextView()
|
||||
self.msgView.set_editable(False)
|
||||
self.msgView.set_wrap_mode(gtk.WRAP_WORD)
|
||||
self.msgView.set_cursor_visible(False)
|
||||
self.msgView.set_size_request(300, 300)
|
||||
self.buf = gtk.TextBuffer()
|
||||
self.buf.set_text(msg)
|
||||
self.msgView.set_buffer(self.buf)
|
||||
self.textWindow.add(self.msgView)
|
||||
self.msgView.show()
|
||||
second_row.add(self.textWindow)
|
||||
self.textWindow.show()
|
||||
first_column.add(second_row)
|
||||
second_row.show()
|
||||
@@ -1,215 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import glob
|
||||
import gtk
|
||||
import gobject
|
||||
import os
|
||||
import re
|
||||
import shlex
|
||||
import subprocess
|
||||
import tempfile
|
||||
from bb.ui.crumbs.hobwidget import hic, HobButton
|
||||
from bb.ui.crumbs.progressbar import HobProgressBar
|
||||
import bb.ui.crumbs.utils
|
||||
import bb.process
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class DeployImageDialog (CrumbsDialog):
|
||||
|
||||
__dummy_usb__ = "--select a usb drive--"
|
||||
|
||||
def __init__(self, title, image_path, parent, flags, buttons=None, standalone=False):
|
||||
super(DeployImageDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
self.image_path = image_path
|
||||
self.standalone = standalone
|
||||
|
||||
self.create_visual_elements()
|
||||
self.connect("response", self.response_cb)
|
||||
|
||||
def create_visual_elements(self):
|
||||
self.set_size_request(600, 400)
|
||||
label = gtk.Label()
|
||||
label.set_alignment(0.0, 0.5)
|
||||
markup = "<span font_desc='12'>The image to be written into usb drive:</span>"
|
||||
label.set_markup(markup)
|
||||
self.vbox.pack_start(label, expand=False, fill=False, padding=2)
|
||||
|
||||
table = gtk.Table(2, 10, False)
|
||||
table.set_col_spacings(5)
|
||||
table.set_row_spacings(5)
|
||||
self.vbox.pack_start(table, expand=True, fill=True)
|
||||
|
||||
scroll = gtk.ScrolledWindow()
|
||||
scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
|
||||
scroll.set_shadow_type(gtk.SHADOW_IN)
|
||||
tv = gtk.TextView()
|
||||
tv.set_editable(False)
|
||||
tv.set_wrap_mode(gtk.WRAP_WORD)
|
||||
tv.set_cursor_visible(False)
|
||||
self.buf = gtk.TextBuffer()
|
||||
self.buf.set_text(self.image_path)
|
||||
tv.set_buffer(self.buf)
|
||||
scroll.add(tv)
|
||||
table.attach(scroll, 0, 10, 0, 1)
|
||||
|
||||
# There are two ways to use DeployImageDialog:
# it is either opened by Hob when the 'Deploy Image' button is clicked,
# or it is opened by a standalone script.
# The following block of code handles the latter case. It adds a 'Select Image' button and
# emits a signal when that button is clicked.
|
||||
if self.standalone:
|
||||
gobject.signal_new("select_image_clicked", self, gobject.SIGNAL_RUN_FIRST,
|
||||
gobject.TYPE_NONE, ())
|
||||
icon = gtk.Image()
|
||||
pix_buffer = gtk.gdk.pixbuf_new_from_file(hic.ICON_IMAGES_DISPLAY_FILE)
|
||||
icon.set_from_pixbuf(pix_buffer)
|
||||
button = gtk.Button("Select Image")
|
||||
button.set_image(icon)
|
||||
#button.set_size_request(140, 50)
|
||||
table.attach(button, 9, 10, 1, 2, gtk.FILL, 0, 0, 0)
|
||||
button.connect("clicked", self.select_image_button_clicked_cb)
|
||||
|
||||
separator = gtk.HSeparator()
|
||||
self.vbox.pack_start(separator, expand=False, fill=False, padding=10)
|
||||
|
||||
self.usb_desc = gtk.Label()
|
||||
self.usb_desc.set_alignment(0.0, 0.5)
|
||||
markup = "<span font_desc='12'>You haven't chosen any USB drive.</span>"
|
||||
self.usb_desc.set_markup(markup)
|
||||
|
||||
self.usb_combo = gtk.combo_box_new_text()
|
||||
self.usb_combo.connect("changed", self.usb_combo_changed_cb)
|
||||
model = self.usb_combo.get_model()
|
||||
model.clear()
|
||||
self.usb_combo.append_text(self.__dummy_usb__)
|
||||
for usb in self.find_all_usb_devices():
|
||||
self.usb_combo.append_text("/dev/" + usb)
|
||||
self.usb_combo.set_active(0)
|
||||
self.vbox.pack_start(self.usb_combo, expand=False, fill=False)
|
||||
self.vbox.pack_start(self.usb_desc, expand=False, fill=False, padding=2)
|
||||
|
||||
self.progress_bar = HobProgressBar()
|
||||
self.vbox.pack_start(self.progress_bar, expand=False, fill=False)
|
||||
separator = gtk.HSeparator()
|
||||
self.vbox.pack_start(separator, expand=False, fill=True, padding=10)
|
||||
|
||||
self.vbox.show_all()
|
||||
self.progress_bar.hide()
|
||||
|
||||
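As the comment in create_visual_elements notes, the dialog can also be driven by a standalone script. A sketch of that mode, assuming pygtk and this class are importable and using an illustrative image path and handler name:

def select_image_clicked_cb(dialog):
    # illustrative handler: a real script would open its own file chooser here
    dialog.set_image_path("/tmp/core-image-minimal.hddimg")
    dialog.set_image_text_buffer("/tmp/core-image-minimal.hddimg")

dialog = DeployImageDialog("USB Image Writer", "", None,
                           gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT,
                           standalone=True)
dialog.add_button("Close", gtk.RESPONSE_NO)
dialog.add_button("Write USB image", gtk.RESPONSE_YES)
dialog.connect("select_image_clicked", select_image_clicked_cb)
dialog.run()
dialog.destroy()
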
def set_image_text_buffer(self, image_path):
|
||||
self.buf.set_text(image_path)
|
||||
|
||||
def set_image_path(self, image_path):
|
||||
self.image_path = image_path
|
||||
|
||||
def popen_read(self, cmd):
|
||||
tmpout, errors = bb.process.run("%s" % cmd)
|
||||
return tmpout.strip()
|
||||
|
||||
def find_all_usb_devices(self):
|
||||
usb_devs = [ os.readlink(u)
|
||||
for u in glob.glob('/dev/disk/by-id/usb*')
|
||||
if not re.search(r'part\d+', u) ]
|
||||
return [ '%s' % u[u.rfind('/')+1:] for u in usb_devs ]
|
||||
|
||||
def get_usb_info(self, dev):
|
||||
return "%s %s" % \
|
||||
(self.popen_read('cat /sys/class/block/%s/device/vendor' % dev),
|
||||
self.popen_read('cat /sys/class/block/%s/device/model' % dev))
|
||||
|
||||
def select_image_button_clicked_cb(self, button):
|
||||
self.emit('select_image_clicked')
|
||||
|
||||
def usb_combo_changed_cb(self, usb_combo):
|
||||
combo_item = self.usb_combo.get_active_text()
|
||||
if not combo_item or combo_item == self.__dummy_usb__:
|
||||
markup = "<span font_desc='12'>You haven't chosen any USB drive.</span>"
|
||||
self.usb_desc.set_markup(markup)
|
||||
else:
|
||||
markup = "<span font_desc='12'>" + self.get_usb_info(combo_item.lstrip("/dev/")) + "</span>"
|
||||
self.usb_desc.set_markup(markup)
|
||||
|
||||
def response_cb(self, dialog, response_id):
|
||||
if response_id == gtk.RESPONSE_YES:
|
||||
lbl = ''
|
||||
combo_item = self.usb_combo.get_active_text()
|
||||
if combo_item and combo_item != self.__dummy_usb__ and self.image_path:
|
||||
cmdline = bb.ui.crumbs.utils.which_terminal()
|
||||
if cmdline:
|
||||
tmpfile = tempfile.NamedTemporaryFile()
|
||||
cmdline += "\"sudo dd if=" + self.image_path + \
|
||||
" of=" + combo_item + "; echo $? > " + tmpfile.name + "\""
|
||||
subprocess.call(shlex.split(cmdline))
|
||||
|
||||
if int(tmpfile.readline().strip()) == 0:
|
||||
lbl = "<b>Deploy image successfully.</b>"
|
||||
else:
|
||||
lbl = "<b>Failed to deploy image.</b>\nPlease check image <b>%s</b> exists and USB device <b>%s</b> is writable." % (self.image_path, combo_item)
|
||||
tmpfile.close()
|
||||
else:
|
||||
if not self.image_path:
|
||||
lbl = "<b>No selection made.</b>\nYou have not selected an image to deploy."
|
||||
else:
|
||||
lbl = "<b>No selection made.</b>\nYou have not selected a USB device."
|
||||
if len(lbl):
|
||||
crumbs_dialog = CrumbsMessageDialog(self, lbl, gtk.STOCK_DIALOG_INFO)
|
||||
button = crumbs_dialog.add_button("Close", gtk.RESPONSE_OK)
|
||||
HobButton.style_button(button)
|
||||
crumbs_dialog.run()
|
||||
crumbs_dialog.destroy()
|
||||
|
||||
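Because the dd command above runs inside a separate terminal emulator, subprocess.call() only reports the terminal's exit status, so the real result is passed back through a temporary file. The same pattern in isolation, with a harmless command standing in for dd:

import shlex
import subprocess
import tempfile

tmpfile = tempfile.NamedTemporaryFile()
# run the command in a shell and write its exit status into the temp file
cmdline = "sh -c \"true; echo $? > %s\"" % tmpfile.name
subprocess.call(shlex.split(cmdline))
status = int(tmpfile.readline().strip())
tmpfile.close()
print("command exited with status %d" % status)
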
def update_progress_bar(self, title, fraction, status=None):
|
||||
self.progress_bar.update(fraction)
|
||||
self.progress_bar.set_title(title)
|
||||
self.progress_bar.set_rcstyle(status)
|
||||
|
||||
def write_file(self, ifile, ofile):
|
||||
self.progress_bar.reset()
|
||||
self.progress_bar.show()
|
||||
|
||||
f_from = os.open(ifile, os.O_RDONLY)
|
||||
f_to = os.open(ofile, os.O_WRONLY)
|
||||
|
||||
total_size = os.stat(ifile).st_size
|
||||
written_size = 0
|
||||
|
||||
while True:
|
||||
buf = os.read(f_from, 1024*1024)
|
||||
if not buf:
|
||||
break
|
||||
os.write(f_to, buf)
|
||||
written_size += 1024*1024
|
||||
self.update_progress_bar("Writing to usb:", written_size * 1.0/total_size)
|
||||
|
||||
self.update_progress_bar("Writing completed:", 1.0)
|
||||
os.close(f_from)
|
||||
os.close(f_to)
|
||||
self.progress_bar.hide()
|
||||
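write_file() streams the image in 1 MB chunks so the progress bar can be updated while the copy runs. The same technique without any GTK involvement, as a rough sketch with illustrative paths and stdout progress reporting:

import os
import sys

def copy_with_progress(src, dst, chunk=1024 * 1024):
    total = os.stat(src).st_size or 1
    done = 0
    f_in = os.open(src, os.O_RDONLY)
    f_out = os.open(dst, os.O_WRONLY | os.O_CREAT)
    while True:
        buf = os.read(f_in, chunk)
        if not buf:
            break
        os.write(f_out, buf)
        done += len(buf)
        sys.stdout.write("\rwritten %d%%" % (done * 100 // total))
        sys.stdout.flush()
    os.close(f_in)
    os.close(f_out)
    sys.stdout.write("\n")

# copy_with_progress("/tmp/core-image-minimal.hddimg", "/dev/sdX")  # illustrative call
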
@@ -1,172 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
import gobject
|
||||
import os
|
||||
from bb.ui.crumbs.hobwidget import HobViewTable, HobInfoButton, HobButton, HobAltButton
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
from bb.ui.crumbs.hig.layerselectiondialog import LayerSelectionDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class ImageSelectionDialog (CrumbsDialog):
|
||||
|
||||
__columns__ = [{
|
||||
'col_name' : 'Image name',
|
||||
'col_id' : 0,
|
||||
'col_style': 'text',
|
||||
'col_min' : 400,
|
||||
'col_max' : 400
|
||||
}, {
|
||||
'col_name' : 'Select',
|
||||
'col_id' : 1,
|
||||
'col_style': 'radio toggle',
|
||||
'col_min' : 160,
|
||||
'col_max' : 160
|
||||
}]
|
||||
|
||||
|
||||
def __init__(self, image_folder, image_types, title, parent, flags, buttons=None, image_extension = {}):
|
||||
super(ImageSelectionDialog, self).__init__(title, parent, flags, buttons)
|
||||
self.connect("response", self.response_cb)
|
||||
|
||||
self.image_folder = image_folder
|
||||
self.image_types = image_types
|
||||
self.image_list = []
|
||||
self.image_names = []
|
||||
self.image_extension = image_extension
|
||||
|
||||
# create visual elements on the dialog
|
||||
self.create_visual_elements()
|
||||
|
||||
self.image_store = gtk.ListStore(gobject.TYPE_STRING, gobject.TYPE_BOOLEAN)
|
||||
self.fill_image_store()
|
||||
|
||||
def create_visual_elements(self):
|
||||
hbox = gtk.HBox(False, 6)
|
||||
|
||||
self.vbox.pack_start(hbox, expand=False, fill=False)
|
||||
|
||||
entry = gtk.Entry()
|
||||
entry.set_text(self.image_folder)
|
||||
table = gtk.Table(1, 10, True)
|
||||
table.set_size_request(560, -1)
|
||||
hbox.pack_start(table, expand=False, fill=False)
|
||||
table.attach(entry, 0, 9, 0, 1)
|
||||
image = gtk.Image()
|
||||
image.set_from_stock(gtk.STOCK_OPEN, gtk.ICON_SIZE_BUTTON)
|
||||
open_button = gtk.Button()
|
||||
open_button.set_image(image)
|
||||
open_button.connect("clicked", self.select_path_cb, self, entry)
|
||||
table.attach(open_button, 9, 10, 0, 1)
|
||||
|
||||
self.image_table = HobViewTable(self.__columns__, "Images")
|
||||
self.image_table.set_size_request(-1, 300)
|
||||
self.image_table.connect("toggled", self.toggled_cb)
|
||||
self.image_table.connect_group_selection(self.table_selected_cb)
|
||||
self.image_table.connect("row-activated", self.row_actived_cb)
|
||||
self.vbox.pack_start(self.image_table, expand=True, fill=True)
|
||||
|
||||
self.show_all()
|
||||
|
||||
def change_image_cb(self, model, path, columnid):
|
||||
if not model:
|
||||
return
|
||||
iter = model.get_iter_first()
|
||||
while iter:
|
||||
rowpath = model.get_path(iter)
|
||||
model[rowpath][columnid] = False
|
||||
iter = model.iter_next(iter)
|
||||
|
||||
model[path][columnid] = True
|
||||
|
||||
def toggled_cb(self, table, cell, path, columnid, tree):
|
||||
model = tree.get_model()
|
||||
self.change_image_cb(model, path, columnid)
|
||||
|
||||
def table_selected_cb(self, selection):
|
||||
model, paths = selection.get_selected_rows()
|
||||
if paths:
|
||||
self.change_image_cb(model, paths[0], 1)
|
||||
|
||||
def row_actived_cb(self, tab, model, path):
|
||||
self.change_image_cb(model, path, 1)
|
||||
self.emit('response', gtk.RESPONSE_YES)
|
||||
|
||||
def select_path_cb(self, action, parent, entry):
|
||||
dialog = gtk.FileChooserDialog("", parent,
|
||||
gtk.FILE_CHOOSER_ACTION_SELECT_FOLDER)
|
||||
text = entry.get_text()
|
||||
dialog.set_current_folder(text if len(text) > 0 else os.getcwd())
|
||||
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button = dialog.add_button("Open", gtk.RESPONSE_YES)
|
||||
HobButton.style_button(button)
|
||||
response = dialog.run()
|
||||
if response == gtk.RESPONSE_YES:
|
||||
path = dialog.get_filename()
|
||||
entry.set_text(path)
|
||||
self.image_folder = path
|
||||
self.fill_image_store()
|
||||
|
||||
dialog.destroy()
|
||||
|
||||
def fill_image_store(self):
|
||||
self.image_list = []
|
||||
self.image_store.clear()
|
||||
imageset = set()
|
||||
for root, dirs, files in os.walk(self.image_folder):
|
||||
# ignore subdirectories; only the top level of the image folder is scanned
|
||||
dirs[:] = []
|
||||
for f in files:
|
||||
for image_type in self.image_types:
|
||||
if image_type in self.image_extension:
|
||||
real_types = self.image_extension[image_type]
|
||||
else:
|
||||
real_types = [image_type]
|
||||
for real_image_type in real_types:
|
||||
if f.endswith('.' + real_image_type):
|
||||
imageset.add(f.rsplit('.' + real_image_type)[0].rsplit('.rootfs')[0])
|
||||
self.image_list.append(f)
|
||||
|
||||
for image in imageset:
|
||||
self.image_store.set(self.image_store.append(), 0, image, 1, False)
|
||||
|
||||
self.image_table.set_model(self.image_store)
|
||||
|
||||
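fill_image_store() collapses the different artifacts of one image into a single row by stripping the image-type extension and an optional '.rootfs' suffix from each file name. That string handling in isolation, with made-up file names:

def image_basename(filename, image_type):
    # strip the image-type extension, then an optional '.rootfs' suffix
    return filename.rsplit('.' + image_type)[0].rsplit('.rootfs')[0]

print(image_basename("core-image-minimal-qemux86.rootfs.ext3", "ext3"))
print(image_basename("core-image-minimal-qemux86.rootfs.tar.bz2", "tar.bz2"))
# both lines print: core-image-minimal-qemux86
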
def response_cb(self, dialog, response_id):
|
||||
self.image_names = []
|
||||
if response_id == gtk.RESPONSE_YES:
|
||||
iter = self.image_store.get_iter_first()
|
||||
while iter:
|
||||
path = self.image_store.get_path(iter)
|
||||
if self.image_store[path][1]:
|
||||
for f in self.image_list:
|
||||
if f.startswith(self.image_store[path][0] + '.'):
|
||||
self.image_names.append(f)
|
||||
break
|
||||
iter = self.image_store.iter_next(iter)
|
||||
@@ -1,297 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
import gobject
|
||||
import os
|
||||
import tempfile
|
||||
from bb.ui.crumbs.hobwidget import hic, HobButton, HobAltButton
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class CellRendererPixbufActivatable(gtk.CellRendererPixbuf):
|
||||
"""
|
||||
A custom CellRenderer implementation which is activatable
|
||||
so that we can handle user clicks
|
||||
"""
|
||||
__gsignals__ = { 'clicked' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
(gobject.TYPE_STRING,)), }
|
||||
|
||||
def __init__(self):
|
||||
gtk.CellRendererPixbuf.__init__(self)
|
||||
self.set_property('mode', gtk.CELL_RENDERER_MODE_ACTIVATABLE)
|
||||
self.set_property('follow-state', True)
|
||||
|
||||
"""
|
||||
Respond to a user click on a cell
|
||||
"""
|
||||
def do_activate(self, event, widget, path, background_area, cell_area, flags):
|
||||
self.emit('clicked', path)
|
||||
|
||||
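A stock gtk.CellRendererPixbuf ignores clicks; the subclass above switches the renderer into ACTIVATABLE mode and re-emits the click as a 'clicked' signal carrying the row path, which gen_layer_widget later uses for its per-row delete button. A minimal, hypothetical wiring of such a renderer:

store = gtk.ListStore(str)
store.append(["/home/user/poky/meta-example"])

treeview = gtk.TreeView(store)
column = gtk.TreeViewColumn("Remove")
renderer = CellRendererPixbufActivatable()
renderer.set_property("stock-id", gtk.STOCK_DELETE)
column.pack_start(renderer, True)
treeview.append_column(column)

def delete_clicked_cb(cell, path):
    # remove the row whose delete icon was clicked
    store.remove(store.get_iter_from_string(path))

renderer.connect("clicked", delete_clicked_cb)
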
#
|
||||
# LayerSelectionDialog
|
||||
#
|
||||
class LayerSelectionDialog (CrumbsDialog):
|
||||
|
||||
TARGETS = [
|
||||
("MY_TREE_MODEL_ROW", gtk.TARGET_SAME_WIDGET, 0),
|
||||
("text/plain", 0, 1),
|
||||
("TEXT", 0, 2),
|
||||
("STRING", 0, 3),
|
||||
]
|
||||
|
||||
def gen_label_widget(self, content):
|
||||
label = gtk.Label()
|
||||
label.set_alignment(0, 0)
|
||||
label.set_markup(content)
|
||||
label.show()
|
||||
return label
|
||||
|
||||
def layer_widget_toggled_cb(self, cell, path, layer_store):
|
||||
name = layer_store[path][0]
|
||||
toggle = not layer_store[path][1]
|
||||
layer_store[path][1] = toggle
|
||||
|
||||
def layer_widget_add_clicked_cb(self, action, layer_store, parent):
|
||||
dialog = gtk.FileChooserDialog("Add new layer", parent,
|
||||
gtk.FILE_CHOOSER_ACTION_SELECT_FOLDER)
|
||||
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button = dialog.add_button("Open", gtk.RESPONSE_YES)
|
||||
HobButton.style_button(button)
|
||||
label = gtk.Label("Select the layer you wish to add")
|
||||
label.show()
|
||||
dialog.set_extra_widget(label)
|
||||
response = dialog.run()
|
||||
path = dialog.get_filename()
|
||||
dialog.destroy()
|
||||
|
||||
lbl = "<b>Error</b>\nUnable to load layer <i>%s</i> because " % path
|
||||
if response == gtk.RESPONSE_YES:
|
||||
import os
|
||||
import os.path
|
||||
layers = []
|
||||
it = layer_store.get_iter_first()
|
||||
while it:
|
||||
layers.append(layer_store.get_value(it, 0))
|
||||
it = layer_store.iter_next(it)
|
||||
|
||||
if not path:
|
||||
lbl += "it is an invalid path."
|
||||
elif not os.path.exists(path+"/conf/layer.conf"):
|
||||
lbl += "there is no layer.conf inside the directory."
|
||||
elif path in layers:
|
||||
lbl += "it is already in loaded layers."
|
||||
else:
|
||||
layer_store.append([path])
|
||||
return
|
||||
dialog = CrumbsMessageDialog(parent, lbl)
|
||||
dialog.add_button(gtk.STOCK_CLOSE, gtk.RESPONSE_OK)
|
||||
response = dialog.run()
|
||||
dialog.destroy()
|
||||
|
||||
def layer_widget_del_clicked_cb(self, action, tree_selection, layer_store):
|
||||
model, iter = tree_selection.get_selected()
|
||||
if iter:
|
||||
layer_store.remove(iter)
|
||||
|
||||
|
||||
def gen_layer_widget(self, layers, layers_avail, window, tooltip=""):
|
||||
hbox = gtk.HBox(False, 6)
|
||||
|
||||
layer_tv = gtk.TreeView()
|
||||
layer_tv.set_rules_hint(True)
|
||||
layer_tv.set_headers_visible(False)
|
||||
tree_selection = layer_tv.get_selection()
|
||||
tree_selection.set_mode(gtk.SELECTION_SINGLE)
|
||||
|
||||
# Enable drag and drop of rows, including row reordering
|
||||
dnd_internal_target = ''
|
||||
dnd_targets = [(dnd_internal_target, gtk.TARGET_SAME_WIDGET, 0)]
|
||||
layer_tv.enable_model_drag_source( gtk.gdk.BUTTON1_MASK,
|
||||
dnd_targets,
|
||||
gtk.gdk.ACTION_MOVE)
|
||||
layer_tv.enable_model_drag_dest(dnd_targets,
|
||||
gtk.gdk.ACTION_MOVE)
|
||||
layer_tv.connect("drag_data_get", self.drag_data_get_cb)
|
||||
layer_tv.connect("drag_data_received", self.drag_data_received_cb)
|
||||
|
||||
col0 = gtk.TreeViewColumn('Path')
|
||||
cell0 = gtk.CellRendererText()
|
||||
cell0.set_padding(5,2)
|
||||
col0.pack_start(cell0, True)
|
||||
col0.set_cell_data_func(cell0, self.draw_layer_path_cb)
|
||||
layer_tv.append_column(col0)
|
||||
|
||||
scroll = gtk.ScrolledWindow()
|
||||
scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
|
||||
scroll.set_shadow_type(gtk.SHADOW_IN)
|
||||
scroll.add(layer_tv)
|
||||
|
||||
table_layer = gtk.Table(2, 10, False)
|
||||
hbox.pack_start(table_layer, expand=True, fill=True)
|
||||
|
||||
table_layer.attach(scroll, 0, 10, 0, 1)
|
||||
|
||||
layer_store = gtk.ListStore(gobject.TYPE_STRING)
|
||||
for layer in layers:
|
||||
layer_store.append([layer])
|
||||
|
||||
col1 = gtk.TreeViewColumn('Enabled')
|
||||
layer_tv.append_column(col1)
|
||||
|
||||
cell1 = CellRendererPixbufActivatable()
|
||||
cell1.set_fixed_size(-1,35)
|
||||
cell1.connect("clicked", self.del_cell_clicked_cb, layer_store)
|
||||
col1.pack_start(cell1, True)
|
||||
col1.set_cell_data_func(cell1, self.draw_delete_button_cb, layer_tv)
|
||||
|
||||
add_button = gtk.Button()
|
||||
add_button.set_relief(gtk.RELIEF_NONE)
|
||||
box = gtk.HBox(False, 6)
|
||||
box.show()
|
||||
add_button.add(box)
|
||||
add_button.connect("enter-notify-event", self.add_hover_cb)
|
||||
add_button.connect("leave-notify-event", self.add_leave_cb)
|
||||
self.im = gtk.Image()
|
||||
self.im.set_from_file(hic.ICON_INDI_ADD_FILE)
|
||||
self.im.show()
|
||||
box.pack_start(self.im, expand=False, fill=False, padding=6)
|
||||
lbl = gtk.Label("Add layer")
|
||||
lbl.set_alignment(0.0, 0.5)
|
||||
lbl.show()
|
||||
box.pack_start(lbl, expand=True, fill=True, padding=6)
|
||||
add_button.connect("clicked", self.layer_widget_add_clicked_cb, layer_store, window)
|
||||
table_layer.attach(add_button, 0, 10, 1, 2, gtk.EXPAND | gtk.FILL, 0, 0, 6)
|
||||
layer_tv.set_model(layer_store)
|
||||
|
||||
hbox.show_all()
|
||||
|
||||
return hbox, layer_store
|
||||
|
||||
def drag_data_get_cb(self, treeview, context, selection, target_id, etime):
|
||||
treeselection = treeview.get_selection()
|
||||
model, iter = treeselection.get_selected()
|
||||
data = model.get_value(iter, 0)
|
||||
selection.set(selection.target, 8, data)
|
||||
|
||||
def drag_data_received_cb(self, treeview, context, x, y, selection, info, etime):
|
||||
model = treeview.get_model()
|
||||
data = selection.data
|
||||
drop_info = treeview.get_dest_row_at_pos(x, y)
|
||||
if drop_info:
|
||||
path, position = drop_info
|
||||
iter = model.get_iter(path)
|
||||
if (position == gtk.TREE_VIEW_DROP_BEFORE or position == gtk.TREE_VIEW_DROP_INTO_OR_BEFORE):
|
||||
model.insert_before(iter, [data])
|
||||
else:
|
||||
model.insert_after(iter, [data])
|
||||
else:
|
||||
model.append([data])
|
||||
if context.action == gtk.gdk.ACTION_MOVE:
|
||||
context.finish(True, True, etime)
|
||||
return
|
||||
|
||||
def add_hover_cb(self, button, event):
|
||||
self.im.set_from_file(hic.ICON_INDI_ADD_HOVER_FILE)
|
||||
|
||||
def add_leave_cb(self, button, event):
|
||||
self.im.set_from_file(hic.ICON_INDI_ADD_FILE)
|
||||
|
||||
def __init__(self, title, layers, layers_non_removable, all_layers, parent, flags, buttons=None):
|
||||
super(LayerSelectionDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
# class members from other objects
|
||||
self.layers = layers
|
||||
self.layers_non_removable = layers_non_removable
|
||||
self.all_layers = all_layers
|
||||
self.layers_changed = False
|
||||
|
||||
# icon for remove button in TreeView
|
||||
im = gtk.Image()
|
||||
im.set_from_file(hic.ICON_INDI_REMOVE_FILE)
|
||||
self.rem_icon = im.get_pixbuf()
|
||||
|
||||
# class members for internal use
|
||||
self.layer_store = None
|
||||
|
||||
# create visual elements on the dialog
|
||||
self.create_visual_elements()
|
||||
self.connect("response", self.response_cb)
|
||||
|
||||
def create_visual_elements(self):
|
||||
layer_widget, self.layer_store = self.gen_layer_widget(self.layers, self.all_layers, self, None)
|
||||
layer_widget.set_size_request(450, 250)
|
||||
self.vbox.pack_start(layer_widget, expand=True, fill=True)
|
||||
self.show_all()
|
||||
|
||||
def response_cb(self, dialog, response_id):
|
||||
model = self.layer_store
|
||||
it = model.get_iter_first()
|
||||
layers = []
|
||||
while it:
|
||||
layers.append(model.get_value(it, 0))
|
||||
it = model.iter_next(it)
|
||||
|
||||
self.layers_changed = (self.layers != layers)
|
||||
self.layers = layers
|
||||
|
||||
"""
|
||||
A custom cell_data_func to draw a delete 'button' in the TreeView for layers
|
||||
other than the meta layer. The deletion of which is prevented so that the
|
||||
user can't shoot themselves in the foot too badly.
|
||||
"""
|
||||
def draw_delete_button_cb(self, col, cell, model, it, tv):
|
||||
path = model.get_value(it, 0)
|
||||
if path in self.layers_non_removable:
|
||||
cell.set_sensitive(False)
|
||||
cell.set_property('pixbuf', None)
|
||||
cell.set_property('mode', gtk.CELL_RENDERER_MODE_INERT)
|
||||
else:
|
||||
cell.set_property('pixbuf', self.rem_icon)
|
||||
cell.set_sensitive(True)
|
||||
cell.set_property('mode', gtk.CELL_RENDERER_MODE_ACTIVATABLE)
|
||||
|
||||
return True
|
||||
|
||||
"""
|
||||
A custom cell_data_func to write an extra message into the layer path cell
|
||||
for the meta layer. We should inform the user that they can't remove it for
|
||||
their own safety.
|
||||
"""
|
||||
def draw_layer_path_cb(self, col, cell, model, it):
|
||||
path = model.get_value(it, 0)
|
||||
if path in self.layers_non_removable:
|
||||
cell.set_property('markup', "<b>It cannot be removed</b>\n%s" % path)
|
||||
else:
|
||||
cell.set_property('text', path)
|
||||
|
||||
def del_cell_clicked_cb(self, cell, path, model):
|
||||
it = model.get_iter_from_string(path)
|
||||
model.remove(it)
|
||||
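For reference, a rough sketch of how a caller might drive this dialog; the layer paths and button labels are illustrative, and whether Hob itself used exactly this sequence is not shown here:

layers = ["/home/user/poky/meta", "/home/user/poky/meta-yocto"]
dialog = LayerSelectionDialog("Layers", layers, ["/home/user/poky/meta"], layers,
                              None, gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT)
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
HobAltButton.style_button(button)
button = dialog.add_button("OK", gtk.RESPONSE_YES)
HobButton.style_button(button)
response = dialog.run()
if response == gtk.RESPONSE_YES and dialog.layers_changed:
    print("new layer list: %s" % dialog.layers)
dialog.destroy()
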
@@ -1,163 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Cristiana Voicu <cristiana.voicu@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
import gobject
|
||||
from bb.ui.crumbs.hobwidget import HobAltButton
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
#
|
||||
# ParsingWarningsDialog
|
||||
#
|
||||
class ParsingWarningsDialog (CrumbsDialog):
|
||||
|
||||
def __init__(self, title, warnings, parent, flags, buttons=None):
|
||||
super(ParsingWarningsDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
self.warnings = warnings
|
||||
self.warning_on = 0
|
||||
self.warn_nb = len(warnings)
|
||||
|
||||
# create visual elements on the dialog
|
||||
self.create_visual_elements()
|
||||
|
||||
def cancel_button_cb(self, button):
|
||||
self.destroy()
|
||||
|
||||
def previous_button_cb(self, button):
|
||||
self.warning_on = self.warning_on - 1
|
||||
self.refresh_components()
|
||||
|
||||
def next_button_cb(self, button):
|
||||
self.warning_on = self.warning_on + 1
|
||||
self.refresh_components()
|
||||
|
||||
def refresh_components(self):
|
||||
lbl = self.warnings[self.warning_on]
|
||||
# When the warning text has 400 or more characters, show it in a scrolled text view instead of a label
|
||||
if 0<= len(lbl) < 400:
|
||||
self.warning_label.set_size_request(320, 230)
|
||||
self.warning_label.set_use_markup(True)
|
||||
self.warning_label.set_line_wrap(True)
|
||||
self.warning_label.set_markup(lbl)
|
||||
self.warning_label.set_property("yalign", 0.00)
|
||||
else:
|
||||
self.textWindow.set_shadow_type(gtk.SHADOW_IN)
|
||||
self.textWindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
|
||||
self.msgView = gtk.TextView()
|
||||
self.msgView.set_editable(False)
|
||||
self.msgView.set_wrap_mode(gtk.WRAP_WORD)
|
||||
self.msgView.set_cursor_visible(False)
|
||||
self.msgView.set_size_request(320, 230)
|
||||
self.buf = gtk.TextBuffer()
|
||||
self.buf.set_text(lbl)
|
||||
self.msgView.set_buffer(self.buf)
|
||||
self.textWindow.add(self.msgView)
|
||||
self.msgView.show()
|
||||
|
||||
if self.warning_on==0:
|
||||
self.previous_button.set_sensitive(False)
|
||||
else:
|
||||
self.previous_button.set_sensitive(True)
|
||||
|
||||
if self.warning_on==self.warn_nb-1:
|
||||
self.next_button.set_sensitive(False)
|
||||
else:
|
||||
self.next_button.set_sensitive(True)
|
||||
|
||||
if self.warn_nb>1:
|
||||
self.heading = "Warning " + str(self.warning_on + 1) + " of " + str(self.warn_nb)
|
||||
self.heading_label.set_markup('<span weight="bold">%s</span>' % self.heading)
|
||||
else:
|
||||
self.heading = "Warning"
|
||||
self.heading_label.set_markup('<span weight="bold">%s</span>' % self.heading)
|
||||
|
||||
self.show_all()
|
||||
|
||||
if 0<= len(lbl) < 400:
|
||||
self.textWindow.hide()
|
||||
else:
|
||||
self.warning_label.hide()
|
||||
|
||||
def create_visual_elements(self):
|
||||
self.set_size_request(350, 350)
|
||||
self.heading_label = gtk.Label()
|
||||
self.heading_label.set_alignment(0, 0)
|
||||
self.warning_label = gtk.Label()
|
||||
self.warning_label.set_selectable(True)
|
||||
self.warning_label.set_alignment(0, 0)
|
||||
self.textWindow = gtk.ScrolledWindow()
|
||||
|
||||
table = gtk.Table(1, 10, False)
|
||||
|
||||
cancel_button = gtk.Button()
|
||||
cancel_button.set_label("Close")
|
||||
cancel_button.connect("clicked", self.cancel_button_cb)
|
||||
cancel_button.set_size_request(110, 30)
|
||||
|
||||
self.previous_button = gtk.Button()
|
||||
image1 = gtk.image_new_from_stock(gtk.STOCK_GO_BACK, gtk.ICON_SIZE_BUTTON)
|
||||
image1.show()
|
||||
box = gtk.HBox(False, 6)
|
||||
box.show()
|
||||
self.previous_button.add(box)
|
||||
lbl = gtk.Label("Previous")
|
||||
lbl.show()
|
||||
box.pack_start(image1, expand=False, fill=False, padding=3)
|
||||
box.pack_start(lbl, expand=True, fill=True, padding=3)
|
||||
self.previous_button.connect("clicked", self.previous_button_cb)
|
||||
self.previous_button.set_size_request(110, 30)
|
||||
|
||||
self.next_button = gtk.Button()
|
||||
image2 = gtk.image_new_from_stock(gtk.STOCK_GO_FORWARD, gtk.ICON_SIZE_BUTTON)
|
||||
image2.show()
|
||||
box = gtk.HBox(False, 6)
|
||||
box.show()
|
||||
self.next_button.add(box)
|
||||
lbl = gtk.Label("Next")
|
||||
lbl.show()
|
||||
box.pack_start(lbl, expand=True, fill=True, padding=3)
|
||||
box.pack_start(image2, expand=False, fill=False, padding=3)
|
||||
self.next_button.connect("clicked", self.next_button_cb)
|
||||
self.next_button.set_size_request(110, 30)
|
||||
|
||||
# When there is more than one warning, we need "Previous" and "Next" buttons
|
||||
if self.warn_nb>1:
|
||||
self.vbox.pack_start(self.heading_label, expand=False, fill=False)
|
||||
self.vbox.pack_start(self.warning_label, expand=False, fill=False)
|
||||
self.vbox.pack_start(self.textWindow, expand=False, fill=False)
|
||||
table.attach(cancel_button, 6, 7, 0, 1, xoptions=gtk.SHRINK)
|
||||
table.attach(self.previous_button, 7, 8, 0, 1, xoptions=gtk.SHRINK)
|
||||
table.attach(self.next_button, 8, 9, 0, 1, xoptions=gtk.SHRINK)
|
||||
self.vbox.pack_end(table, expand=False, fill=False)
|
||||
else:
|
||||
self.vbox.pack_start(self.heading_label, expand=False, fill=False)
|
||||
self.vbox.pack_start(self.warning_label, expand=False, fill=False)
|
||||
self.vbox.pack_start(self.textWindow, expand=False, fill=False)
|
||||
cancel_button = self.add_button("Close", gtk.RESPONSE_CANCEL)
|
||||
HobAltButton.style_button(cancel_button)
|
||||
|
||||
self.refresh_components()
|
||||
@@ -1,437 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2013 Intel Corporation
|
||||
#
|
||||
# Authored by Andrei Dinu <andrei.adrianx.dinu@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import string
|
||||
import gtk
|
||||
import gobject
|
||||
import os
|
||||
import tempfile
|
||||
import glib
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
from bb.ui.crumbs.hig.settingsuihelper import SettingsUIHelper
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
from bb.ui.crumbs.hig.layerselectiondialog import LayerSelectionDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUI's
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class PropertyDialog(CrumbsDialog):
|
||||
|
||||
def __init__(self, title, parent, information, flags, buttons=None):
|
||||
|
||||
super(PropertyDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
self.properties = information
|
||||
|
||||
if len(self.properties) == 10:
|
||||
self.create_recipe_visual_elements()
|
||||
elif len(self.properties) == 5:
|
||||
self.create_package_visual_elements()
|
||||
else:
|
||||
self.create_information_visual_elements()
|
||||
|
||||
|
||||
def create_information_visual_elements(self):
|
||||
|
||||
HOB_ICON_BASE_DIR = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), ("icons/"))
|
||||
ICON_PACKAGES_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('info/info_display.png'))
|
||||
|
||||
self.set_resizable(False)
|
||||
|
||||
self.table = gtk.Table(1,1,False)
|
||||
self.table.set_row_spacings(0)
|
||||
self.table.set_col_spacings(0)
|
||||
|
||||
self.image = gtk.Image()
|
||||
self.image.set_from_file(ICON_PACKAGES_DISPLAY_FILE)
|
||||
self.image.set_property("xalign",0)
|
||||
#self.vbox.add(self.image)
|
||||
|
||||
image_info = self.properties.split("*")[0]
|
||||
info = self.properties.split("*")[1]
|
||||
|
||||
vbox = gtk.VBox(True, spacing=30)
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_line_wrap(False)
|
||||
self.label_short.set_markup(image_info)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.info_label = gtk.Label()
|
||||
self.info_label.set_line_wrap(True)
|
||||
self.info_label.set_markup(info)
|
||||
self.info_label.set_property("yalign", 0.5)
|
||||
|
||||
self.table.attach(self.image, 0,1,0,1, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL,xpadding=5,ypadding=5)
|
||||
self.table.attach(self.label_short, 0,1,0,1, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL,xpadding=40,ypadding=5)
|
||||
self.table.attach(self.info_label, 0,1,1,2, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL,xpadding=40,ypadding=10)
|
||||
|
||||
self.vbox.add(self.table)
|
||||
|
||||
def treeViewTooltip( self, widget, e, tooltips, cell, emptyText="" ):
|
||||
try:
|
||||
(path,col,x,y) = widget.get_path_at_pos( int(e.x), int(e.y) )
|
||||
it = widget.get_model().get_iter(path)
|
||||
value = widget.get_model().get_value(it,cell)
|
||||
if value in self.tooltip_items:
|
||||
tooltips.set_tip(widget, self.tooltip_items[value])
|
||||
tooltips.enable()
|
||||
else:
|
||||
tooltips.set_tip(widget, emptyText)
|
||||
except:
|
||||
tooltips.set_tip(widget, emptyText)
|
||||
|
||||
|
||||
def create_package_visual_elements(self):
|
||||
|
||||
name = self.properties['name']
|
||||
binb = self.properties['binb']
|
||||
size = self.properties['size']
|
||||
recipe = self.properties['recipe']
|
||||
file_list = self.properties['files_list']
|
||||
|
||||
file_list = file_list.strip("{}'")
|
||||
files_temp = ''
|
||||
paths_temp = ''
|
||||
files_binb = []
|
||||
paths_binb = []
|
||||
|
||||
self.tooltip_items = {}
|
||||
|
||||
self.set_resizable(False)
|
||||
|
||||
#cleaning out the recipe variable
|
||||
recipe = recipe.split("+")[0]
|
||||
|
||||
vbox = gtk.VBox(True,spacing = 0)
|
||||
|
||||
###################################### NAME ROW + COL #################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Name: </span>" + name)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
###################################### SIZE ROW + COL ######################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Size: </span>" + size)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
##################################### RECIPE ROW + COL #########################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Recipe: </span>" + recipe)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
##################################### BINB ROW + COL #######################################
|
||||
|
||||
if binb != '':
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Brought in by: </span>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.label_info = gtk.Label()
|
||||
self.label_info.set_size_request(300,-1)
|
||||
self.label_info.set_selectable(True)
|
||||
self.label_info.set_line_wrap(True)
|
||||
self.label_info.set_markup(binb)
|
||||
self.label_info.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
self.vbox.add(self.label_info)
|
||||
|
||||
#################################### FILES BROUGHT BY PACKAGES ###################################
|
||||
|
||||
if file_list != '':
|
||||
|
||||
self.textWindow = gtk.ScrolledWindow()
|
||||
self.textWindow.set_shadow_type(gtk.SHADOW_IN)
|
||||
self.textWindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
|
||||
self.textWindow.set_size_request(100, 170)
|
||||
|
||||
sstatemirrors_store = gtk.ListStore(str)
|
||||
|
||||
self.sstatemirrors_tv = gtk.TreeView()
|
||||
self.sstatemirrors_tv.set_rules_hint(True)
|
||||
self.sstatemirrors_tv.set_headers_visible(True)
|
||||
self.textWindow.add(self.sstatemirrors_tv)
|
||||
|
||||
self.cell1 = gtk.CellRendererText()
|
||||
col1 = gtk.TreeViewColumn('Package files', self.cell1)
|
||||
col1.set_cell_data_func(self.cell1, self.regex_field)
|
||||
self.sstatemirrors_tv.append_column(col1)
|
||||
|
||||
for items in file_list.split(']'):
|
||||
if len(items) > 1:
|
||||
paths_temp = items.split(":")[0]
|
||||
paths_binb.append(paths_temp.strip(" ,'"))
|
||||
files_temp = items.split(":")[1]
|
||||
files_binb.append(files_temp.strip(" ['"))
|
||||
|
||||
unsorted_list = []
|
||||
|
||||
for items in range(len(paths_binb)):
|
||||
if len(files_binb[items]) > 1:
|
||||
for aduse in (files_binb[items].split(",")):
|
||||
unsorted_list.append(paths_binb[items].split(name)[len(paths_binb[items].split(name))-1] + '/' + aduse.strip(" '"))
|
||||
|
||||
|
||||
unsorted_list.sort()
|
||||
for items in unsorted_list:
|
||||
temp = items
|
||||
while len(items) > 35:
|
||||
items = items[:len(items)/2] + "" + items[len(items)/2+1:]
|
||||
if len(items) == 35:
|
||||
items = items[:len(items)/2] + "..." + items[len(items)/2+3:]
|
||||
self.tooltip_items[items] = temp
|
||||
|
||||
sstatemirrors_store.append([str(items)])
|
||||
|
||||
|
||||
self.sstatemirrors_tv.set_model(sstatemirrors_store)
|
||||
|
||||
tips = gtk.Tooltips()
|
||||
tips.set_tip(self.sstatemirrors_tv, "")
|
||||
self.sstatemirrors_tv.connect("motion-notify-event", self.treeViewTooltip, tips, 0)
|
||||
self.sstatemirrors_tv.set_events(gtk.gdk.POINTER_MOTION_MASK)
|
||||
|
||||
self.vbox.add(self.textWindow)
|
||||
|
||||
self.vbox.show_all()

def regex_field(self, column, cell, model, iter):
    cell.set_property('text', model.get_value(iter, 0))
    return

def create_recipe_visual_elements(self):
|
||||
|
||||
summary = self.properties['summary']
|
||||
name = self.properties['name']
|
||||
version = self.properties['version']
|
||||
revision = self.properties['revision']
|
||||
binb = self.properties['binb']
|
||||
group = self.properties['group']
|
||||
license = self.properties['license']
|
||||
homepage = self.properties['homepage']
|
||||
bugtracker = self.properties['bugtracker']
|
||||
description = self.properties['description']
|
||||
|
||||
self.set_resizable(False)
|
||||
|
||||
#cleaning out the version variable and also the summary
|
||||
version = version.split(":")[1]
|
||||
if len(version) > 30:
|
||||
version = version.split("+")[0]
|
||||
else:
|
||||
version = version.split("-")[0]
|
||||
license = license.replace("&" , "and")
|
||||
if (homepage == ''):
|
||||
homepage = 'unknown'
|
||||
if (bugtracker == ''):
|
||||
bugtracker = 'unknown'
|
||||
summary = summary.split("+")[0]
|
||||
|
||||
#calculating the rows needed for the table
|
||||
binb_items_count = len(binb.split(','))
|
||||
binb_items = binb.split(',')
|
||||
|
||||
vbox = gtk.VBox(True,spacing = 0)
|
||||
|
||||
######################################## SUMMARY LABEL #########################################
|
||||
|
||||
if summary != '':
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<b>" + summary + "</b>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.pack_start(self.label_short, expand=False, fill=False, padding=0)
|
||||
|
||||
########################################## NAME ROW + COL #######################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Name: </span>" + name)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
####################################### VERSION ROW + COL ####################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Version: </span>" + version)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
##################################### REVISION ROW + COL #####################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Revision: </span>" + revision)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
################################## GROUP ROW + COL ############################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Group: </span>" + group)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
################################# HOMEPAGE ROW + COL ############################################
|
||||
|
||||
if homepage != 'unknown':
|
||||
self.label_info = gtk.Label()
|
||||
self.label_info.set_selectable(True)
|
||||
self.label_info.set_line_wrap(True)
|
||||
if len(homepage) > 35:
|
||||
self.label_info.set_markup("<a href=\"" + homepage + "\">" + homepage[0:35] + "..." + "</a>")
|
||||
else:
|
||||
self.label_info.set_markup("<a href=\"" + homepage + "\">" + homepage[0:60] + "</a>")
|
||||
|
||||
self.label_info.set_property("xalign", 0)
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<b>Homepage: </b>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
self.vbox.add(self.label_info)
|
||||
|
||||
################################# BUGTRACKER ROW + COL ###########################################
|
||||
|
||||
if bugtracker != 'unknown':
|
||||
self.label_info = gtk.Label()
|
||||
self.label_info.set_selectable(True)
|
||||
self.label_info.set_line_wrap(True)
|
||||
if len(bugtracker) > 35:
|
||||
self.label_info.set_markup("<a href=\"" + bugtracker + "\">" + bugtracker[0:35] + "..." + "</a>")
|
||||
else:
|
||||
self.label_info.set_markup("<a href=\"" + bugtracker + "\">" + bugtracker[0:60] + "</a>")
|
||||
self.label_info.set_property("xalign", 0)
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<b>Bugtracker: </b>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
self.vbox.add(self.label_info)
|
||||
|
||||
################################# LICENSE ROW + COL ############################################
|
||||
|
||||
self.label_info = gtk.Label()
|
||||
self.label_info.set_size_request(300,-1)
|
||||
self.label_info.set_selectable(True)
|
||||
self.label_info.set_line_wrap(True)
|
||||
self.label_info.set_markup(license)
|
||||
self.label_info.set_property("xalign", 0)
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">License: </span>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
self.vbox.add(self.label_info)
|
||||
|
||||
################################### BINB ROW+COL #############################################
|
||||
|
||||
if binb != '':
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Brought in by: </span>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
|
||||
self.label_info = gtk.Label()
|
||||
self.label_info.set_size_request(300,-1)
|
||||
self.label_info.set_selectable(True)
|
||||
self.label_info.set_markup(binb)
|
||||
self.label_info.set_property("xalign", 0)
|
||||
self.label_info.set_line_wrap(True)
|
||||
|
||||
self.vbox.add(self.label_short)
|
||||
self.vbox.add(self.label_info)
|
||||
|
||||
################################ DESCRIPTION TAG ROW #################################################
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_markup("<span weight=\"bold\">Description </span>")
|
||||
self.label_short.set_property("xalign", 0)
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
################################ DESCRIPTION INFORMATION ROW ##########################################
|
||||
|
||||
hbox = gtk.HBox(True,spacing = 0)
|
||||
|
||||
self.label_short = gtk.Label()
|
||||
self.label_short.set_size_request(300,-1)
|
||||
self.label_short.set_selectable(True)
|
||||
self.label_short.set_text(description)
|
||||
self.label_short.set_line_wrap(True)
|
||||
self.label_short.set_property("xalign", 0)
|
||||
self.vbox.add(self.label_short)
|
||||
|
||||
self.vbox.show_all()
|
||||
@@ -1,90 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUIs
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class ProxyDetailsDialog (CrumbsDialog):
|
||||
|
||||
def __init__(self, title, user, passwd, parent, flags, buttons=None):
|
||||
super(ProxyDetailsDialog, self).__init__(title, parent, flags, buttons)
|
||||
self.connect("response", self.response_cb)
|
||||
|
||||
self.auth = not (user == None or passwd == None or user == "")
|
||||
self.user = user or ""
|
||||
self.passwd = passwd or ""
|
||||
|
||||
# create visual elements on the dialog
|
||||
self.create_visual_elements()
|
||||
|
||||
def create_visual_elements(self):
|
||||
self.auth_checkbox = gtk.CheckButton("Use authentication")
|
||||
self.auth_checkbox.set_tooltip_text("Check this box to set the username and the password")
|
||||
self.auth_checkbox.set_active(self.auth)
|
||||
self.auth_checkbox.connect("toggled", self.auth_checkbox_toggled_cb)
|
||||
self.vbox.pack_start(self.auth_checkbox, expand=False, fill=False)
|
||||
|
||||
hbox = gtk.HBox(False, 6)
|
||||
self.user_label = gtk.Label("Username:")
|
||||
self.user_text = gtk.Entry()
|
||||
self.user_text.set_text(self.user)
|
||||
hbox.pack_start(self.user_label, expand=False, fill=False)
|
||||
hbox.pack_end(self.user_text, expand=False, fill=False)
|
||||
self.vbox.pack_start(hbox, expand=False, fill=False)
|
||||
|
||||
hbox = gtk.HBox(False, 6)
|
||||
self.passwd_label = gtk.Label("Password:")
|
||||
self.passwd_text = gtk.Entry()
|
||||
self.passwd_text.set_text(self.passwd)
|
||||
hbox.pack_start(self.passwd_label, expand=False, fill=False)
|
||||
hbox.pack_end(self.passwd_text, expand=False, fill=False)
|
||||
self.vbox.pack_start(hbox, expand=False, fill=False)
|
||||
|
||||
self.refresh_auth_components()
|
||||
self.show_all()
|
||||
|
||||
def refresh_auth_components(self):
|
||||
self.user_label.set_sensitive(self.auth)
|
||||
self.user_text.set_editable(self.auth)
|
||||
self.user_text.set_sensitive(self.auth)
|
||||
self.passwd_label.set_sensitive(self.auth)
|
||||
self.passwd_text.set_editable(self.auth)
|
||||
self.passwd_text.set_sensitive(self.auth)
|
||||
|
||||
def auth_checkbox_toggled_cb(self, button):
|
||||
self.auth = self.auth_checkbox.get_active()
|
||||
self.refresh_auth_components()
|
||||
|
||||
def response_cb(self, dialog, response_id):
|
||||
if response_id == gtk.RESPONSE_OK:
|
||||
if self.auth:
|
||||
self.user = self.user_text.get_text()
|
||||
self.passwd = self.passwd_text.get_text()
|
||||
else:
|
||||
self.user = None
|
||||
self.passwd = None
|
||||
@@ -1,122 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
import os
|
||||
from bb.ui.crumbs.hobwidget import HobInfoButton, HobButton, HobAltButton
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUIs
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class SettingsUIHelper():
|
||||
|
||||
def gen_label_widget(self, content):
|
||||
label = gtk.Label()
|
||||
label.set_alignment(0, 0)
|
||||
label.set_markup(content)
|
||||
label.show()
|
||||
return label
|
||||
|
||||
def gen_label_info_widget(self, content, tooltip):
|
||||
table = gtk.Table(1, 10, False)
|
||||
label = self.gen_label_widget(content)
|
||||
info = HobInfoButton(tooltip, self)
|
||||
table.attach(label, 0, 1, 0, 1, xoptions=gtk.FILL)
|
||||
table.attach(info, 1, 2, 0, 1, xoptions=gtk.FILL, xpadding=10)
|
||||
return table
|
||||
|
||||
def gen_spinner_widget(self, content, lower, upper, tooltip=""):
|
||||
hbox = gtk.HBox(False, 12)
|
||||
adjust = gtk.Adjustment(value=content, lower=lower, upper=upper, step_incr=1)
|
||||
spinner = gtk.SpinButton(adjustment=adjust, climb_rate=1, digits=0)
|
||||
|
||||
spinner.set_value(content)
|
||||
hbox.pack_start(spinner, expand=False, fill=False)
|
||||
|
||||
info = HobInfoButton(tooltip, self)
|
||||
hbox.pack_start(info, expand=False, fill=False)
|
||||
|
||||
hbox.show_all()
|
||||
return hbox, spinner
|
||||
|
||||
def gen_combo_widget(self, curr_item, all_item, tooltip=""):
|
||||
hbox = gtk.HBox(False, 12)
|
||||
combo = gtk.combo_box_new_text()
|
||||
hbox.pack_start(combo, expand=False, fill=False)
|
||||
|
||||
index = 0
|
||||
for item in all_item or []:
|
||||
combo.append_text(item)
|
||||
if item == curr_item:
|
||||
combo.set_active(index)
|
||||
index += 1
|
||||
|
||||
info = HobInfoButton(tooltip, self)
|
||||
hbox.pack_start(info, expand=False, fill=False)
|
||||
|
||||
hbox.show_all()
|
||||
return hbox, combo
|
||||
|
||||
def entry_widget_select_path_cb(self, action, parent, entry):
|
||||
dialog = gtk.FileChooserDialog("", parent,
|
||||
gtk.FILE_CHOOSER_ACTION_SELECT_FOLDER)
|
||||
text = entry.get_text()
|
||||
dialog.set_current_folder(text if len(text) > 0 else os.getcwd())
|
||||
button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
|
||||
HobAltButton.style_button(button)
|
||||
button = dialog.add_button("Open", gtk.RESPONSE_YES)
|
||||
HobButton.style_button(button)
|
||||
response = dialog.run()
|
||||
if response == gtk.RESPONSE_YES:
|
||||
path = dialog.get_filename()
|
||||
entry.set_text(path)
|
||||
|
||||
dialog.destroy()
|
||||
|
||||
def gen_entry_widget(self, content, parent, tooltip="", need_button=True):
|
||||
hbox = gtk.HBox(False, 12)
|
||||
entry = gtk.Entry()
|
||||
entry.set_text(content)
|
||||
entry.set_size_request(350,30)
|
||||
|
||||
if need_button:
|
||||
table = gtk.Table(1, 10, False)
|
||||
hbox.pack_start(table, expand=True, fill=True)
|
||||
table.attach(entry, 0, 9, 0, 1, xoptions=gtk.SHRINK)
|
||||
image = gtk.Image()
|
||||
image.set_from_stock(gtk.STOCK_OPEN,gtk.ICON_SIZE_BUTTON)
|
||||
open_button = gtk.Button()
|
||||
open_button.set_image(image)
|
||||
open_button.connect("clicked", self.entry_widget_select_path_cb, parent, entry)
|
||||
table.attach(open_button, 9, 10, 0, 1, xoptions=gtk.SHRINK)
|
||||
else:
|
||||
hbox.pack_start(entry, expand=True, fill=True)
|
||||
|
||||
if tooltip != "":
|
||||
info = HobInfoButton(tooltip, self)
|
||||
hbox.pack_start(info, expand=False, fill=False)
|
||||
|
||||
hbox.show_all()
|
||||
return hbox, entry
|
||||
@@ -1,900 +0,0 @@
|
||||
#
|
||||
# BitBake Graphical GTK User Interface
|
||||
#
|
||||
# Copyright (C) 2011-2012 Intel Corporation
|
||||
#
|
||||
# Authored by Joshua Lock <josh@linux.intel.com>
|
||||
# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
|
||||
# Authored by Shane Wang <shane.wang@intel.com>
|
||||
#
|
||||
# This program is free software; you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License version 2 as
|
||||
# published by the Free Software Foundation.
|
||||
#
|
||||
# This program is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License along
|
||||
# with this program; if not, write to the Free Software Foundation, Inc.,
|
||||
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
|
||||
|
||||
import gtk
|
||||
import gobject
|
||||
import hashlib
|
||||
from bb.ui.crumbs.hobwidget import hic, HobInfoButton, HobButton, HobAltButton
|
||||
from bb.ui.crumbs.progressbar import HobProgressBar
|
||||
from bb.ui.crumbs.hig.settingsuihelper import SettingsUIHelper
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
|
||||
from bb.ui.crumbs.hig.proxydetailsdialog import ProxyDetailsDialog
|
||||
|
||||
"""
|
||||
The following are convenience classes for implementing GNOME HIG compliant
|
||||
BitBake GUIs
|
||||
In summary: spacing = 12px, border-width = 6px
|
||||
"""
|
||||
|
||||
class SimpleSettingsDialog (CrumbsDialog, SettingsUIHelper):
|
||||
|
||||
(BUILD_ENV_PAGE_ID,
|
||||
SHARED_STATE_PAGE_ID,
|
||||
PROXIES_PAGE_ID,
|
||||
OTHERS_PAGE_ID) = range(4)
|
||||
|
||||
(TEST_NETWORK_NONE,
|
||||
TEST_NETWORK_INITIAL,
|
||||
TEST_NETWORK_RUNNING,
|
||||
TEST_NETWORK_PASSED,
|
||||
TEST_NETWORK_FAILED,
|
||||
TEST_NETWORK_CANCELED) = range(6)
|
||||
|
||||
TARGETS = [
|
||||
("MY_TREE_MODEL_ROW", gtk.TARGET_SAME_WIDGET, 0),
|
||||
("text/plain", 0, 1),
|
||||
("TEXT", 0, 2),
|
||||
("STRING", 0, 3),
|
||||
]
|
||||
|
||||
def __init__(self, title, configuration, all_image_types,
|
||||
all_package_formats, all_distros, all_sdk_machines,
|
||||
max_threads, parent, flags, handler, buttons=None):
|
||||
super(SimpleSettingsDialog, self).__init__(title, parent, flags, buttons)
|
||||
|
||||
# class members from other objects
|
||||
# bitbake settings from Builder.Configuration
|
||||
self.configuration = configuration
|
||||
self.image_types = all_image_types
|
||||
self.all_package_formats = all_package_formats
|
||||
self.all_distros = all_distros
|
||||
self.all_sdk_machines = all_sdk_machines
|
||||
self.max_threads = max_threads
|
||||
|
||||
# class members for internal use
|
||||
self.dldir_text = None
|
||||
self.sstatedir_text = None
|
||||
self.sstatemirrors_list = []
|
||||
self.sstatemirrors_changed = 0
|
||||
self.bb_spinner = None
|
||||
self.pmake_spinner = None
|
||||
self.rootfs_size_spinner = None
|
||||
self.extra_size_spinner = None
|
||||
self.gplv3_checkbox = None
|
||||
self.toolchain_checkbox = None
|
||||
self.setting_store = None
|
||||
self.image_types_checkbuttons = {}
|
||||
|
||||
self.md5 = self.config_md5()
|
||||
self.proxy_md5 = self.config_proxy_md5()
|
||||
self.settings_changed = False
|
||||
self.proxy_settings_changed = False
|
||||
self.handler = handler
|
||||
self.proxy_test_ran = False
|
||||
self.selected_mirror_row = 0
|
||||
self.new_mirror = False
|
||||
|
||||
# create visual elements on the dialog
|
||||
self.create_visual_elements()
|
||||
self.connect("response", self.response_cb)
|
||||
|
||||
def _get_sorted_value(self, var):
|
||||
return " ".join(sorted(str(var).split())) + "\n"
|
||||
|
||||
def config_proxy_md5(self):
|
||||
data = ("ENABLE_PROXY: " + self._get_sorted_value(self.configuration.enable_proxy))
|
||||
if self.configuration.enable_proxy:
|
||||
for protocol in self.configuration.proxies.keys():
|
||||
data += (protocol + ": " + self._get_sorted_value(self.configuration.combine_proxy(protocol)))
|
||||
return hashlib.md5(data).hexdigest()
|
||||
|
||||
def config_md5(self):
|
||||
data = ""
|
||||
for key in self.configuration.extra_setting.keys():
|
||||
data += (key + ": " + self._get_sorted_value(self.configuration.extra_setting[key]))
|
||||
return hashlib.md5(data).hexdigest()
|
||||
|
||||
def gen_proxy_entry_widget(self, protocol, parent, need_button=True, line=0):
|
||||
label = gtk.Label(protocol.upper() + " proxy")
|
||||
self.proxy_table.attach(label, 0, 1, line, line+1, xpadding=24)
|
||||
|
||||
proxy_entry = gtk.Entry()
|
||||
proxy_entry.set_size_request(300, -1)
|
||||
self.proxy_table.attach(proxy_entry, 1, 2, line, line+1, ypadding=4)
|
||||
|
||||
self.proxy_table.attach(gtk.Label(":"), 2, 3, line, line+1, xpadding=12, ypadding=4)
|
||||
|
||||
port_entry = gtk.Entry()
|
||||
port_entry.set_size_request(60, -1)
|
||||
self.proxy_table.attach(port_entry, 3, 4, line, line+1, ypadding=4)
|
||||
|
||||
details_button = HobAltButton("Details")
|
||||
details_button.connect("clicked", self.details_cb, parent, protocol)
|
||||
self.proxy_table.attach(details_button, 4, 5, line, line+1, xpadding=4, yoptions=gtk.EXPAND)
|
||||
|
||||
return proxy_entry, port_entry, details_button
|
||||
|
||||
def refresh_proxy_components(self):
|
||||
self.same_checkbox.set_sensitive(self.configuration.enable_proxy)
|
||||
|
||||
self.http_proxy.set_text(self.configuration.combine_host_only("http"))
|
||||
self.http_proxy.set_editable(self.configuration.enable_proxy)
|
||||
self.http_proxy.set_sensitive(self.configuration.enable_proxy)
|
||||
self.http_proxy_port.set_text(self.configuration.combine_port_only("http"))
|
||||
self.http_proxy_port.set_editable(self.configuration.enable_proxy)
|
||||
self.http_proxy_port.set_sensitive(self.configuration.enable_proxy)
|
||||
self.http_proxy_details.set_sensitive(self.configuration.enable_proxy)
|
||||
|
||||
self.https_proxy.set_text(self.configuration.combine_host_only("https"))
|
||||
self.https_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.https_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.https_proxy_port.set_text(self.configuration.combine_port_only("https"))
|
||||
self.https_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.https_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.https_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
|
||||
self.ftp_proxy.set_text(self.configuration.combine_host_only("ftp"))
|
||||
self.ftp_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.ftp_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.ftp_proxy_port.set_text(self.configuration.combine_port_only("ftp"))
|
||||
self.ftp_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.ftp_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.ftp_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
|
||||
self.socks_proxy.set_text(self.configuration.combine_host_only("socks"))
|
||||
self.socks_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.socks_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.socks_proxy_port.set_text(self.configuration.combine_port_only("socks"))
|
||||
self.socks_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.socks_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.socks_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
|
||||
self.cvs_proxy.set_text(self.configuration.combine_host_only("cvs"))
|
||||
self.cvs_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.cvs_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.cvs_proxy_port.set_text(self.configuration.combine_port_only("cvs"))
|
||||
self.cvs_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.cvs_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
self.cvs_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
|
||||
|
||||
if self.configuration.same_proxy:
|
||||
if self.http_proxy.get_text():
|
||||
[w.set_text(self.http_proxy.get_text()) for w in self.same_proxy_addresses]
|
||||
if self.http_proxy_port.get_text():
|
||||
[w.set_text(self.http_proxy_port.get_text()) for w in self.same_proxy_ports]
|
||||
|
||||
def proxy_checkbox_toggled_cb(self, button):
|
||||
self.configuration.enable_proxy = self.proxy_checkbox.get_active()
|
||||
if not self.configuration.enable_proxy:
|
||||
self.configuration.same_proxy = False
|
||||
self.same_checkbox.set_active(self.configuration.same_proxy)
|
||||
self.save_proxy_data()
|
||||
self.refresh_proxy_components()
|
||||
|
||||
def same_checkbox_toggled_cb(self, button):
|
||||
self.configuration.same_proxy = self.same_checkbox.get_active()
|
||||
self.save_proxy_data()
|
||||
self.refresh_proxy_components()
|
||||
|
||||
def save_proxy_data(self):
|
||||
self.configuration.split_proxy("http", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
|
||||
if self.configuration.same_proxy:
|
||||
self.configuration.split_proxy("https", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
|
||||
self.configuration.split_proxy("ftp", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
|
||||
self.configuration.split_proxy("socks", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
|
||||
self.configuration.split_proxy("cvs", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
|
||||
else:
|
||||
self.configuration.split_proxy("https", self.https_proxy.get_text() + ":" + self.https_proxy_port.get_text())
|
||||
self.configuration.split_proxy("ftp", self.ftp_proxy.get_text() + ":" + self.ftp_proxy_port.get_text())
|
||||
self.configuration.split_proxy("socks", self.socks_proxy.get_text() + ":" + self.socks_proxy_port.get_text())
|
||||
self.configuration.split_proxy("cvs", self.cvs_proxy.get_text() + ":" + self.cvs_proxy_port.get_text())
|
||||
|
||||
def response_cb(self, dialog, response_id):
|
||||
if response_id == gtk.RESPONSE_YES:
|
||||
# Check that all proxy entries have a corresponding port
|
||||
for proxy, port in zip(self.all_proxy_addresses, self.all_proxy_ports):
|
||||
if proxy.get_text() and not port.get_text():
|
||||
lbl = "<b>Enter all port numbers</b>\n\n"
|
||||
msg = "Proxy servers require a port number. Please make sure you have entered a port number for each proxy server."
|
||||
dialog = CrumbsMessageDialog(self, lbl, gtk.STOCK_DIALOG_WARNING, msg)
|
||||
button = dialog.add_button("Close", gtk.RESPONSE_OK)
|
||||
HobButton.style_button(button)
|
||||
response = dialog.run()
|
||||
dialog.destroy()
|
||||
self.emit_stop_by_name("response")
|
||||
return
|
||||
|
||||
self.configuration.dldir = self.dldir_text.get_text()
|
||||
self.configuration.sstatedir = self.sstatedir_text.get_text()
|
||||
self.configuration.sstatemirror = ""
|
||||
for mirror in self.sstatemirrors_list:
|
||||
if mirror[1] != "" and mirror[2].startswith("file://"):
|
||||
if mirror[1].endswith("\\1"):
|
||||
smirror = mirror[2] + " " + mirror[1] + " \\n "
|
||||
else:
|
||||
smirror = mirror[2] + " " + mirror[1] + "\\1 \\n "
|
||||
self.configuration.sstatemirror += smirror
|
||||
self.configuration.bbthread = self.bb_spinner.get_value_as_int()
|
||||
self.configuration.pmake = self.pmake_spinner.get_value_as_int()
|
||||
self.save_proxy_data()
|
||||
self.configuration.extra_setting = {}
|
||||
it = self.setting_store.get_iter_first()
|
||||
while it:
|
||||
key = self.setting_store.get_value(it, 0)
|
||||
value = self.setting_store.get_value(it, 1)
|
||||
self.configuration.extra_setting[key] = value
|
||||
it = self.setting_store.iter_next(it)
|
||||
|
||||
md5 = self.config_md5()
|
||||
self.settings_changed = (self.md5 != md5)
|
||||
self.proxy_settings_changed = (self.proxy_md5 != self.config_proxy_md5())
|
||||
|
||||
def create_build_environment_page(self):
|
||||
advanced_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.set_border_width(6)
|
||||
|
||||
advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Parallel threads</span>'), expand=False, fill=False)
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
label = self.gen_label_widget("BitBake parallel threads")
|
||||
tooltip = "Sets the number of threads that BitBake tasks can simultaneously run. See the <a href=\""
|
||||
tooltip += "http://www.yoctoproject.org/docs/current/poky-ref-manual/"
|
||||
tooltip += "poky-ref-manual.html#var-BB_NUMBER_THREADS\">Poky reference manual</a> for information"
|
||||
bbthread_widget, self.bb_spinner = self.gen_spinner_widget(self.configuration.bbthread, 1, self.max_threads,"<b>BitBake parallel threads</b>" + "*" + tooltip)
|
||||
sub_vbox.pack_start(label, expand=False, fill=False)
|
||||
sub_vbox.pack_start(bbthread_widget, expand=False, fill=False)
|
||||
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
label = self.gen_label_widget("Make parallel threads")
|
||||
tooltip = "Sets the maximum number of threads the host can use during the build. See the <a href=\""
|
||||
tooltip += "http://www.yoctoproject.org/docs/current/poky-ref-manual/"
|
||||
tooltip += "poky-ref-manual.html#var-PARALLEL_MAKE\">Poky reference manual</a> for information"
|
||||
pmake_widget, self.pmake_spinner = self.gen_spinner_widget(self.configuration.pmake, 1, self.max_threads,"<b>Make parallel threads</b>" + "*" + tooltip)
|
||||
sub_vbox.pack_start(label, expand=False, fill=False)
|
||||
sub_vbox.pack_start(pmake_widget, expand=False, fill=False)
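Editor's note: the two spinner rows above feed the BB_NUMBER_THREADS and PARALLEL_MAKE variables named in their tooltips. As a hedged illustration of what the chosen values amount to in conf-file terms (the helper below and its name are the editor's, not Hob's):

# Hypothetical helper: render the two spinner values as the local.conf lines
# the tooltips point at. BB_NUMBER_THREADS is the number of BitBake tasks run
# in parallel; PARALLEL_MAKE is passed to make as its -j argument.
def format_thread_settings(bbthread, pmake):
    return ('BB_NUMBER_THREADS = "%d"\n' % bbthread
            + 'PARALLEL_MAKE = "-j %d"\n' % pmake)

print(format_thread_settings(8, 8))
# BB_NUMBER_THREADS = "8"
# PARALLEL_MAKE = "-j 8"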
|
||||
|
||||
advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Downloaded source code</span>'), expand=False, fill=False)
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
label = self.gen_label_widget("Downloads directory")
|
||||
tooltip = "Select a folder that caches the upstream project source code"
|
||||
dldir_widget, self.dldir_text = self.gen_entry_widget(self.configuration.dldir, self,"<b>Downloaded source code</b>" + "*" + tooltip)
|
||||
sub_vbox.pack_start(label, expand=False, fill=False)
|
||||
sub_vbox.pack_start(dldir_widget, expand=False, fill=False)
|
||||
|
||||
return advanced_vbox
|
||||
|
||||
def create_shared_state_page(self):
|
||||
advanced_vbox = gtk.VBox(False)
|
||||
advanced_vbox.set_border_width(12)
|
||||
|
||||
sub_vbox = gtk.VBox(False)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False, padding=24)
|
||||
content = "<span>Shared state directory</span>"
|
||||
tooltip = "Select a folder that caches your prebuilt results"
|
||||
label = self.gen_label_info_widget(content,"<b>Shared state directory</b>" + "*" + tooltip)
|
||||
sstatedir_widget, self.sstatedir_text = self.gen_entry_widget(self.configuration.sstatedir, self)
|
||||
sub_vbox.pack_start(label, expand=False, fill=False)
|
||||
sub_vbox.pack_start(sstatedir_widget, expand=False, fill=False, padding=6)
|
||||
|
||||
content = "<span weight=\"bold\">Shared state mirrors</span>"
|
||||
tooltip = "URLs pointing to pre-built mirrors that will speed your build. "
|
||||
tooltip += "Select the \'Standard\' configuration if the structure of your "
|
||||
tooltip += "mirror replicates the structure of your local shared state directory. "
|
||||
tooltip += "For more information on shared state mirrors, check the <a href=\""
|
||||
tooltip += "http://www.yoctoproject.org/docs/current/poky-ref-manual/"
|
||||
tooltip += "poky-ref-manual.html#shared-state\">Yocto Project Reference Manual</a>."
|
||||
table = self.gen_label_info_widget(content,"<b>Shared state mirrors</b>" + "*" + tooltip)
|
||||
advanced_vbox.pack_start(table, expand=False, fill=False, padding=6)
|
||||
|
||||
sub_vbox = gtk.VBox(False)
|
||||
advanced_vbox.pack_start(sub_vbox, gtk.TRUE, gtk.TRUE, 0)
|
||||
searched_string = "file://"
|
||||
|
||||
if self.sstatemirrors_changed == 0:
|
||||
self.sstatemirrors_changed = 1
|
||||
sstatemirrors = self.configuration.sstatemirror
|
||||
if sstatemirrors == "":
|
||||
sm_list = ["Standard", "", "file://(.*)"]
|
||||
self.sstatemirrors_list.append(sm_list)
|
||||
else:
|
||||
while sstatemirrors.find(searched_string) != -1:
|
||||
if sstatemirrors.find(searched_string,1) != -1:
|
||||
sstatemirror = sstatemirrors[:sstatemirrors.find(searched_string,1)]
|
||||
sstatemirrors = sstatemirrors[sstatemirrors.find(searched_string,1):]
|
||||
else:
|
||||
sstatemirror = sstatemirrors
|
||||
sstatemirrors = sstatemirrors[1:]
|
||||
|
||||
sstatemirror_fields = [x for x in sstatemirror.split(' ') if x.strip()]
|
||||
if len(sstatemirror_fields):
|
||||
if sstatemirror_fields[0] == "file://(.*)":
|
||||
sm_list = ["Standard", sstatemirror_fields[1], "file://(.*)"]
|
||||
else:
|
||||
sm_list = ["Custom", sstatemirror_fields[1], sstatemirror_fields[0]]
|
||||
self.sstatemirrors_list.append(sm_list)
|
||||
|
||||
sstatemirrors_widget, sstatemirrors_store = self.gen_shared_sstate_widget(self.sstatemirrors_list, self)
|
||||
sub_vbox.pack_start(sstatemirrors_widget, expand=True, fill=True)
|
||||
|
||||
table = gtk.Table(1, 10, False)
|
||||
table.set_col_spacings(6)
|
||||
add_mirror_button = HobAltButton("Add mirror")
|
||||
add_mirror_button.connect("clicked", self.add_mirror)
|
||||
add_mirror_button.set_size_request(120,30)
|
||||
table.attach(add_mirror_button, 1, 2, 0, 1, xoptions=gtk.SHRINK)
|
||||
|
||||
self.delete_button = HobAltButton("Delete mirror")
|
||||
self.delete_button.connect("clicked", self.delete_cb)
|
||||
self.delete_button.set_size_request(120, 30)
|
||||
table.attach(self.delete_button, 3, 4, 0, 1, xoptions=gtk.SHRINK)
|
||||
|
||||
advanced_vbox.pack_start(table, expand=False, fill=False, padding=6)
|
||||
|
||||
return advanced_vbox
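Editor's note: the page above parses SSTATE_MIRRORS entries of the form "<regex> <url>\1 \n" back into (configuration, URL, regex) rows, and response_cb earlier joins the rows back into that same string. A compact re-statement of that round trip, written by the editor for illustration (the helper name is not from Hob):

# Hypothetical re-statement of the mirror serialization: each row becomes
# "<regex> <url>\1 \n", the entry format this dialog reads back on the next run.
def serialize_mirrors(rows):
    value = ""
    for _config, url, regex in rows:
        if url and regex.startswith("file://"):
            suffix = "" if url.endswith("\\1") else "\\1"
            value += "%s %s%s \\n " % (regex, url, suffix)
    return value

print(serialize_mirrors([["Standard", "file:///srv/sstate-cache/", "file://(.*)"]]))
# file://(.*) file:///srv/sstate-cache/\1 \n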
|
||||
|
||||
def gen_shared_sstate_widget(self, sstatemirrors_list, window):
|
||||
hbox = gtk.HBox(False)
|
||||
|
||||
sstatemirrors_store = gtk.ListStore(str, str, str)
|
||||
for sstatemirror in sstatemirrors_list:
|
||||
sstatemirrors_store.append(sstatemirror)
|
||||
|
||||
self.sstatemirrors_tv = gtk.TreeView()
|
||||
self.sstatemirrors_tv.set_rules_hint(True)
|
||||
self.sstatemirrors_tv.set_headers_visible(True)
|
||||
tree_selection = self.sstatemirrors_tv.get_selection()
|
||||
tree_selection.set_mode(gtk.SELECTION_SINGLE)
|
||||
|
||||
# Enable drag and drop of rows, including row moves
|
||||
self.sstatemirrors_tv.enable_model_drag_source( gtk.gdk.BUTTON1_MASK,
|
||||
self.TARGETS,
|
||||
gtk.gdk.ACTION_DEFAULT|
|
||||
gtk.gdk.ACTION_MOVE)
|
||||
self.sstatemirrors_tv.enable_model_drag_dest(self.TARGETS,
|
||||
gtk.gdk.ACTION_DEFAULT)
|
||||
self.sstatemirrors_tv.connect("drag_data_get", self.drag_data_get_cb)
|
||||
self.sstatemirrors_tv.connect("drag_data_received", self.drag_data_received_cb)
|
||||
|
||||
|
||||
self.scroll = gtk.ScrolledWindow()
|
||||
self.scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
|
||||
self.scroll.set_shadow_type(gtk.SHADOW_IN)
|
||||
self.scroll.connect('size-allocate', self.scroll_changed)
|
||||
self.scroll.add(self.sstatemirrors_tv)
|
||||
|
||||
#list store for cell renderer
|
||||
m = gtk.ListStore(gobject.TYPE_STRING)
|
||||
m.append(["Standard"])
|
||||
m.append(["Custom"])
|
||||
|
||||
cell0 = gtk.CellRendererCombo()
|
||||
cell0.set_property("model",m)
|
||||
cell0.set_property("text-column", 0)
|
||||
cell0.set_property("editable", True)
|
||||
cell0.set_property("has-entry", False)
|
||||
col0 = gtk.TreeViewColumn("Configuration")
|
||||
col0.pack_start(cell0, False)
|
||||
col0.add_attribute(cell0, "text", 0)
|
||||
col0.set_cell_data_func(cell0, self.configuration_field)
|
||||
self.sstatemirrors_tv.append_column(col0)
|
||||
|
||||
cell0.connect("edited", self.combo_changed, sstatemirrors_store)
|
||||
|
||||
self.cell1 = gtk.CellRendererText()
|
||||
self.cell1.set_padding(5,2)
|
||||
col1 = gtk.TreeViewColumn('Regex', self.cell1)
|
||||
col1.set_cell_data_func(self.cell1, self.regex_field)
|
||||
self.sstatemirrors_tv.append_column(col1)
|
||||
|
||||
self.cell1.connect("edited", self.regex_changed, sstatemirrors_store)
|
||||
|
||||
cell2 = gtk.CellRendererText()
|
||||
cell2.set_padding(5,2)
|
||||
cell2.set_property("editable", True)
|
||||
col2 = gtk.TreeViewColumn('URL', cell2)
|
||||
col2.set_cell_data_func(cell2, self.url_field)
|
||||
self.sstatemirrors_tv.append_column(col2)
|
||||
|
||||
cell2.connect("edited", self.url_changed, sstatemirrors_store)
|
||||
|
||||
self.sstatemirrors_tv.set_model(sstatemirrors_store)
|
||||
self.sstatemirrors_tv.set_cursor(self.selected_mirror_row)
|
||||
hbox.pack_start(self.scroll, expand=True, fill=True)
|
||||
hbox.show_all()
|
||||
|
||||
return hbox, sstatemirrors_store
|
||||
|
||||
def drag_data_get_cb(self, treeview, context, selection, target_id, etime):
|
||||
treeselection = treeview.get_selection()
|
||||
model, iter = treeselection.get_selected()
|
||||
data = model.get_string_from_iter(iter)
|
||||
selection.set(selection.target, 8, data)
|
||||
|
||||
def drag_data_received_cb(self, treeview, context, x, y, selection, info, etime):
|
||||
model = treeview.get_model()
|
||||
data = []
|
||||
tree_iter = model.get_iter_from_string(selection.data)
|
||||
data.append(model.get_value(tree_iter, 0))
|
||||
data.append(model.get_value(tree_iter, 1))
|
||||
data.append(model.get_value(tree_iter, 2))
|
||||
|
||||
drop_info = treeview.get_dest_row_at_pos(x, y)
|
||||
if drop_info:
|
||||
path, position = drop_info
|
||||
iter = model.get_iter(path)
|
||||
if (position == gtk.TREE_VIEW_DROP_BEFORE or position == gtk.TREE_VIEW_DROP_INTO_OR_BEFORE):
|
||||
model.insert_before(iter, data)
|
||||
else:
|
||||
model.insert_after(iter, data)
|
||||
else:
|
||||
model.append(data)
|
||||
if context.action == gtk.gdk.ACTION_MOVE:
|
||||
context.finish(True, True, etime)
|
||||
return
|
||||
|
||||
def delete_cb(self, button):
|
||||
selection = self.sstatemirrors_tv.get_selection()
|
||||
tree_model, tree_iter = selection.get_selected()
|
||||
index = int(tree_model.get_string_from_iter(tree_iter))
|
||||
if index == 0:
|
||||
self.selected_mirror_row = index
|
||||
else:
|
||||
self.selected_mirror_row = index - 1
|
||||
self.sstatemirrors_list.pop(index)
|
||||
self.refresh_shared_state_page()
|
||||
if not self.sstatemirrors_list:
|
||||
self.delete_button.set_sensitive(False)
|
||||
|
||||
def add_mirror(self, button):
|
||||
self.new_mirror = True
|
||||
tooltip = "Select the pre-built mirror that will speed your build"
|
||||
index = len(self.sstatemirrors_list)
|
||||
self.selected_mirror_row = index
|
||||
sm_list = ["Standard", "", "file://(.*)"]
|
||||
self.sstatemirrors_list.append(sm_list)
|
||||
self.refresh_shared_state_page()
|
||||
|
||||
def scroll_changed(self, widget, event, data=None):
|
||||
if self.new_mirror == True:
|
||||
adj = widget.get_vadjustment()
|
||||
adj.set_value(adj.upper - adj.page_size)
|
||||
self.new_mirror = False
|
||||
|
||||
def combo_changed(self, widget, path, text, model):
|
||||
model[path][0] = text
|
||||
selection = self.sstatemirrors_tv.get_selection()
|
||||
tree_model, tree_iter = selection.get_selected()
|
||||
index = int(tree_model.get_string_from_iter(tree_iter))
|
||||
self.sstatemirrors_list[index][0] = text
|
||||
|
||||
def regex_changed(self, cell, path, new_text, user_data):
|
||||
user_data[path][2] = new_text
|
||||
selection = self.sstatemirrors_tv.get_selection()
|
||||
tree_model, tree_iter = selection.get_selected()
|
||||
index = int(tree_model.get_string_from_iter(tree_iter))
|
||||
self.sstatemirrors_list[index][2] = new_text
|
||||
return
|
||||
|
||||
def url_changed(self, cell, path, new_text, user_data):
|
||||
if new_text!="Enter the mirror URL" and new_text!="Match regex and replace it with this URL":
|
||||
user_data[path][1] = new_text
|
||||
selection = self.sstatemirrors_tv.get_selection()
|
||||
tree_model, tree_iter = selection.get_selected()
|
||||
index = int(tree_model.get_string_from_iter(tree_iter))
|
||||
self.sstatemirrors_list[index][1] = new_text
|
||||
return
|
||||
|
||||
def configuration_field(self, column, cell, model, iter):
|
||||
cell.set_property('text', model.get_value(iter, 0))
|
||||
if model.get_value(iter, 0) == "Standard":
|
||||
self.cell1.set_property("sensitive", False)
|
||||
self.cell1.set_property("editable", False)
|
||||
else:
|
||||
self.cell1.set_property("sensitive", True)
|
||||
self.cell1.set_property("editable", True)
|
||||
return
|
||||
|
||||
def regex_field(self, column, cell, model, iter):
|
||||
cell.set_property('text', model.get_value(iter, 2))
|
||||
return
|
||||
|
||||
def url_field(self, column, cell, model, iter):
|
||||
text = model.get_value(iter, 1)
|
||||
if text == "":
|
||||
if model.get_value(iter, 0) == "Standard":
|
||||
text = "Enter the mirror URL"
|
||||
else:
|
||||
text = "Match regex and replace it with this URL"
|
||||
cell.set_property('text', text)
|
||||
return
|
||||
|
||||
def refresh_shared_state_page(self):
|
||||
page_num = self.nb.get_current_page()
|
||||
self.nb.remove_page(page_num);
|
||||
self.nb.insert_page(self.create_shared_state_page(), gtk.Label("Shared state"),page_num)
|
||||
self.show_all()
|
||||
self.nb.set_current_page(page_num)
|
||||
|
||||
def test_proxy_ended(self, passed):
|
||||
self.proxy_test_running = False
|
||||
self.set_test_proxy_state(self.TEST_NETWORK_PASSED if passed else self.TEST_NETWORK_FAILED)
|
||||
self.set_sensitive(True)
|
||||
self.refresh_proxy_components()
|
||||
|
||||
def timer_func(self):
|
||||
self.test_proxy_progress.pulse()
|
||||
return self.proxy_test_running
|
||||
|
||||
def test_network_button_cb(self, b):
|
||||
self.set_test_proxy_state(self.TEST_NETWORK_RUNNING)
|
||||
self.set_sensitive(False)
|
||||
self.save_proxy_data()
|
||||
if self.configuration.enable_proxy == True:
|
||||
self.handler.set_http_proxy(self.configuration.combine_proxy("http"))
|
||||
self.handler.set_https_proxy(self.configuration.combine_proxy("https"))
|
||||
self.handler.set_ftp_proxy(self.configuration.combine_proxy("ftp"))
|
||||
self.handler.set_socks_proxy(self.configuration.combine_proxy("socks"))
|
||||
self.handler.set_cvs_proxy(self.configuration.combine_host_only("cvs"), self.configuration.combine_port_only("cvs"))
|
||||
elif self.configuration.enable_proxy == False:
|
||||
self.handler.set_http_proxy("")
|
||||
self.handler.set_https_proxy("")
|
||||
self.handler.set_ftp_proxy("")
|
||||
self.handler.set_socks_proxy("")
|
||||
self.handler.set_cvs_proxy("", "")
|
||||
self.proxy_test_ran = True
|
||||
self.proxy_test_running = True
|
||||
gobject.timeout_add(100, self.timer_func)
|
||||
self.handler.trigger_network_test()
|
||||
|
||||
def test_proxy_focus_event(self, w, direction):
|
||||
if self.test_proxy_state in [self.TEST_NETWORK_PASSED, self.TEST_NETWORK_FAILED]:
|
||||
self.set_test_proxy_state(self.TEST_NETWORK_INITIAL)
|
||||
return False
|
||||
|
||||
def http_proxy_changed(self, e):
|
||||
if not self.configuration.same_proxy:
|
||||
return
|
||||
if e == self.http_proxy:
|
||||
[w.set_text(self.http_proxy.get_text()) for w in self.same_proxy_addresses]
|
||||
else:
|
||||
[w.set_text(self.http_proxy_port.get_text()) for w in self.same_proxy_ports]
|
||||
|
||||
def proxy_address_focus_out_event(self, w, direction):
|
||||
text = w.get_text()
|
||||
if not text:
|
||||
return False
|
||||
if text.find("//") == -1:
|
||||
w.set_text("http://" + text)
|
||||
return False
|
||||
|
||||
def set_test_proxy_state(self, state):
|
||||
if self.test_proxy_state == state:
|
||||
return
|
||||
[self.proxy_table.remove(w) for w in self.test_gui_elements]
|
||||
if state == self.TEST_NETWORK_INITIAL:
|
||||
self.proxy_table.attach(self.test_network_button, 1, 2, 5, 6)
|
||||
self.test_network_button.show()
|
||||
elif state == self.TEST_NETWORK_RUNNING:
|
||||
self.test_proxy_progress.set_rcstyle("running")
|
||||
self.test_proxy_progress.set_text("Testing network configuration")
|
||||
self.proxy_table.attach(self.test_proxy_progress, 0, 5, 5, 6, xpadding=4)
|
||||
self.test_proxy_progress.show()
|
||||
else: # passed or failed
|
||||
self.dummy_progress.update(1.0)
|
||||
if state == self.TEST_NETWORK_PASSED:
|
||||
self.dummy_progress.set_text("Your network is properly configured")
|
||||
self.dummy_progress.set_rcstyle("running")
|
||||
else:
|
||||
self.dummy_progress.set_text("Network test failed")
|
||||
self.dummy_progress.set_rcstyle("fail")
|
||||
self.proxy_table.attach(self.dummy_progress, 0, 4, 5, 6)
|
||||
self.proxy_table.attach(self.retest_network_button, 4, 5, 5, 6, xpadding=4)
|
||||
self.dummy_progress.show()
|
||||
self.retest_network_button.show()
|
||||
self.test_proxy_state = state
|
||||
|
||||
def create_network_page(self):
|
||||
advanced_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.set_border_width(6)
|
||||
self.same_proxy_addresses = []
|
||||
self.same_proxy_ports = []
|
||||
self.all_proxy_ports = []
|
||||
self.all_proxy_addresses = []
|
||||
|
||||
sub_vbox = gtk.VBox(False, 6)
|
||||
advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
|
||||
label = self.gen_label_widget("<span weight=\"bold\">Set the proxies used when fetching source code</span>")
|
||||
tooltip = "Set the proxies used when fetching source code. A blank field uses a direct internet connection."
|
||||
info = HobInfoButton("<span weight=\"bold\">Set the proxies used when fetching source code</span>" + "*" + tooltip, self)
|
||||
hbox = gtk.HBox(False, 12)
|
||||
hbox.pack_start(label, expand=True, fill=True)
|
||||
hbox.pack_start(info, expand=False, fill=False)
|
||||
sub_vbox.pack_start(hbox, expand=False, fill=False)
|
||||
|
||||
proxy_test_focus = []
|
||||
self.direct_checkbox = gtk.RadioButton(None, "Direct network connection")
|
||||
proxy_test_focus.append(self.direct_checkbox)
|
||||
self.direct_checkbox.set_tooltip_text("Check this box to use a direct internet connection with no proxy")
|
||||
self.direct_checkbox.set_active(not self.configuration.enable_proxy)
|
||||
sub_vbox.pack_start(self.direct_checkbox, expand=False, fill=False)
|
||||
|
||||
self.proxy_checkbox = gtk.RadioButton(self.direct_checkbox, "Manual proxy configuration")
|
||||
proxy_test_focus.append(self.proxy_checkbox)
|
||||
self.proxy_checkbox.set_tooltip_text("Check this box to manually set up a specific proxy")
|
||||
self.proxy_checkbox.set_active(self.configuration.enable_proxy)
|
||||
sub_vbox.pack_start(self.proxy_checkbox, expand=False, fill=False)
|
||||
|
||||
self.same_checkbox = gtk.CheckButton("Use the HTTP proxy for all protocols")
|
||||
proxy_test_focus.append(self.same_checkbox)
|
||||
self.same_checkbox.set_tooltip_text("Check this box to use the HTTP proxy for all five proxies")
|
||||
self.same_checkbox.set_active(self.configuration.same_proxy)
|
||||
hbox = gtk.HBox(False, 12)
|
||||
hbox.pack_start(self.same_checkbox, expand=False, fill=False, padding=24)
|
||||
sub_vbox.pack_start(hbox, expand=False, fill=False)
|
||||
|
||||
self.proxy_table = gtk.Table(6, 5, False)
|
||||
self.http_proxy, self.http_proxy_port, self.http_proxy_details = self.gen_proxy_entry_widget(
|
||||
"http", self, True, 0)
|
||||
proxy_test_focus +=[self.http_proxy, self.http_proxy_port]
|
||||
self.http_proxy.connect("changed", self.http_proxy_changed)
|
||||
self.http_proxy_port.connect("changed", self.http_proxy_changed)
|
||||
|
||||
self.https_proxy, self.https_proxy_port, self.https_proxy_details = self.gen_proxy_entry_widget(
|
||||
"https", self, True, 1)
|
||||
proxy_test_focus += [self.https_proxy, self.https_proxy_port]
|
||||
self.same_proxy_addresses.append(self.https_proxy)
|
||||
self.same_proxy_ports.append(self.https_proxy_port)
|
||||
|
||||
self.ftp_proxy, self.ftp_proxy_port, self.ftp_proxy_details = self.gen_proxy_entry_widget(
|
||||
"ftp", self, True, 2)
|
||||
proxy_test_focus += [self.ftp_proxy, self.ftp_proxy_port]
|
||||
self.same_proxy_addresses.append(self.ftp_proxy)
|
||||
self.same_proxy_ports.append(self.ftp_proxy_port)
|
||||
|
||||
self.socks_proxy, self.socks_proxy_port, self.socks_proxy_details = self.gen_proxy_entry_widget(
|
||||
"socks", self, True, 3)
|
||||
proxy_test_focus += [self.socks_proxy, self.socks_proxy_port]
|
||||
self.same_proxy_addresses.append(self.socks_proxy)
|
||||
self.same_proxy_ports.append(self.socks_proxy_port)
|
||||
|
||||
self.cvs_proxy, self.cvs_proxy_port, self.cvs_proxy_details = self.gen_proxy_entry_widget(
|
||||
"cvs", self, True, 4)
|
||||
proxy_test_focus += [self.cvs_proxy, self.cvs_proxy_port]
|
||||
self.same_proxy_addresses.append(self.cvs_proxy)
|
||||
self.same_proxy_ports.append(self.cvs_proxy_port)
|
||||
self.all_proxy_ports = self.same_proxy_ports + [self.http_proxy_port]
|
||||
self.all_proxy_addresses = self.same_proxy_addresses + [self.http_proxy]
|
||||
sub_vbox.pack_start(self.proxy_table, expand=False, fill=False)
|
||||
self.proxy_table.show_all()
|
||||
|
||||
# Create the graphical elements for the network test feature, but don't display them yet
|
||||
self.test_network_button = HobAltButton("Test network configuration")
|
||||
self.test_network_button.connect("clicked", self.test_network_button_cb)
|
||||
self.test_proxy_progress = HobProgressBar()
|
||||
self.dummy_progress = HobProgressBar()
|
||||
self.retest_network_button = HobAltButton("Retest")
|
||||
self.retest_network_button.connect("clicked", self.test_network_button_cb)
|
||||
self.test_gui_elements = [self.test_network_button, self.test_proxy_progress, self.dummy_progress, self.retest_network_button]
|
||||
# Initialize the network tester
|
||||
self.test_proxy_state = self.TEST_NETWORK_NONE
|
||||
self.set_test_proxy_state(self.TEST_NETWORK_INITIAL)
|
||||
self.proxy_test_passed_id = self.handler.connect("network-passed", lambda h:self.test_proxy_ended(True))
|
||||
        self.proxy_test_failed_id = self.handler.connect("network-failed", lambda h:self.test_proxy_ended(False))
        [w.connect("focus-in-event", self.test_proxy_focus_event) for w in proxy_test_focus]
        [w.connect("focus-out-event", self.proxy_address_focus_out_event) for w in self.all_proxy_addresses]

        self.direct_checkbox.connect("toggled", self.proxy_checkbox_toggled_cb)
        self.proxy_checkbox.connect("toggled", self.proxy_checkbox_toggled_cb)
        self.same_checkbox.connect("toggled", self.same_checkbox_toggled_cb)

        self.refresh_proxy_components()
        return advanced_vbox

    def switch_to_page(self, page_id):
        self.nb.set_current_page(page_id)

    def details_cb(self, button, parent, protocol):
        self.save_proxy_data()
        dialog = ProxyDetailsDialog(title = protocol.upper() + " Proxy Details",
            user = self.configuration.proxies[protocol][1],
            passwd = self.configuration.proxies[protocol][2],
            parent = parent,
            flags = gtk.DIALOG_MODAL
                    | gtk.DIALOG_DESTROY_WITH_PARENT
                    | gtk.DIALOG_NO_SEPARATOR)
        dialog.add_button(gtk.STOCK_CLOSE, gtk.RESPONSE_OK)
        response = dialog.run()
        if response == gtk.RESPONSE_OK:
            self.configuration.proxies[protocol][1] = dialog.user
            self.configuration.proxies[protocol][2] = dialog.passwd
            self.refresh_proxy_components()
        dialog.destroy()

    def rootfs_combo_changed_cb(self, rootfs_combo, all_package_format, check_hbox):
        combo_item = self.rootfs_combo.get_active_text()
        for child in check_hbox.get_children():
            if isinstance(child, gtk.CheckButton):
                check_hbox.remove(child)
        for format in all_package_format:
            if format != combo_item:
                check_button = gtk.CheckButton(format)
                check_hbox.pack_start(check_button, expand=False, fill=False)
        check_hbox.show_all()

    def gen_pkgfmt_widget(self, curr_package_format, all_package_format, tooltip_combo="", tooltip_extra=""):
        pkgfmt_hbox = gtk.HBox(False, 24)

        rootfs_vbox = gtk.VBox(False, 6)
        pkgfmt_hbox.pack_start(rootfs_vbox, expand=False, fill=False)

        label = self.gen_label_widget("Root file system package format")
        rootfs_vbox.pack_start(label, expand=False, fill=False)

        rootfs_format = ""
        if curr_package_format:
            rootfs_format = curr_package_format.split()[0]

        rootfs_format_widget, rootfs_combo = self.gen_combo_widget(rootfs_format, all_package_format, tooltip_combo)
        rootfs_vbox.pack_start(rootfs_format_widget, expand=False, fill=False)

        extra_vbox = gtk.VBox(False, 6)
        pkgfmt_hbox.pack_start(extra_vbox, expand=False, fill=False)

        label = self.gen_label_widget("Additional package formats")
        extra_vbox.pack_start(label, expand=False, fill=False)

        check_hbox = gtk.HBox(False, 12)
        extra_vbox.pack_start(check_hbox, expand=False, fill=False)
        for format in all_package_format:
            if format != rootfs_format:
                check_button = gtk.CheckButton(format)
                is_active = (format in curr_package_format.split())
                check_button.set_active(is_active)
                check_hbox.pack_start(check_button, expand=False, fill=False)

        info = HobInfoButton(tooltip_extra, self)
        check_hbox.pack_end(info, expand=False, fill=False)

        rootfs_combo.connect("changed", self.rootfs_combo_changed_cb, all_package_format, check_hbox)

        pkgfmt_hbox.show_all()

        return pkgfmt_hbox, rootfs_combo, check_hbox

    def editable_settings_cell_edited(self, cell, path_string, new_text, model):
        it = model.get_iter_from_string(path_string)
        column = cell.get_data("column")
        model.set(it, column, new_text)

    def editable_settings_add_item_clicked(self, button, model):
        new_item = ["##KEY##", "##VALUE##"]

        iter = model.append()
        model.set (iter,
            0, new_item[0],
            1, new_item[1],
           )

    def editable_settings_remove_item_clicked(self, button, treeview):
        selection = treeview.get_selection()
        model, iter = selection.get_selected()

        if iter:
            path = model.get_path(iter)[0]
            model.remove(iter)

    def gen_editable_settings(self, setting, tooltip=""):
        setting_hbox = gtk.HBox(False, 12)

        vbox = gtk.VBox(False, 12)
        setting_hbox.pack_start(vbox, expand=True, fill=True)

        setting_store = gtk.ListStore(gobject.TYPE_STRING, gobject.TYPE_STRING)
        for key in setting.keys():
            setting_store.set(setting_store.append(), 0, key, 1, setting[key])

        setting_tree = gtk.TreeView(setting_store)
        setting_tree.set_headers_visible(True)
        setting_tree.set_size_request(300, 100)

        col = gtk.TreeViewColumn('Key')
        col.set_min_width(100)
        col.set_max_width(150)
        col.set_resizable(True)
        col1 = gtk.TreeViewColumn('Value')
        col1.set_min_width(100)
        col1.set_max_width(150)
        col1.set_resizable(True)
        setting_tree.append_column(col)
        setting_tree.append_column(col1)
        cell = gtk.CellRendererText()
        cell.set_property('width-chars', 10)
        cell.set_property('editable', True)
        cell.set_data("column", 0)
        cell.connect("edited", self.editable_settings_cell_edited, setting_store)
        cell1 = gtk.CellRendererText()
        cell1.set_property('width-chars', 10)
        cell1.set_property('editable', True)
        cell1.set_data("column", 1)
        cell1.connect("edited", self.editable_settings_cell_edited, setting_store)
        col.pack_start(cell, True)
        col1.pack_end(cell1, True)
        col.set_attributes(cell, text=0)
        col1.set_attributes(cell1, text=1)

        scroll = gtk.ScrolledWindow()
        scroll.set_shadow_type(gtk.SHADOW_IN)
        scroll.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
        scroll.add(setting_tree)
        vbox.pack_start(scroll, expand=True, fill=True)

        # some buttons
        hbox = gtk.HBox(True, 6)
        vbox.pack_start(hbox, False, False)

        button = gtk.Button(stock=gtk.STOCK_ADD)
        button.connect("clicked", self.editable_settings_add_item_clicked, setting_store)
        hbox.pack_start(button)

        button = gtk.Button(stock=gtk.STOCK_REMOVE)
        button.connect("clicked", self.editable_settings_remove_item_clicked, setting_tree)
        hbox.pack_start(button)

        info = HobInfoButton(tooltip, self)
        setting_hbox.pack_start(info, expand=False, fill=False)

        return setting_hbox, setting_store

    def create_others_page(self):
        advanced_vbox = gtk.VBox(False, 6)
        advanced_vbox.set_border_width(6)

        sub_vbox = gtk.VBox(False, 6)
        advanced_vbox.pack_start(sub_vbox, expand=True, fill=True)
        label = self.gen_label_widget("<span weight=\"bold\">Add your own variables:</span>")
        tooltip = "These are key/value pairs for your extra settings. Click \'Add\' and then directly edit the key and the value"
        setting_widget, self.setting_store = self.gen_editable_settings(self.configuration.extra_setting,"<b>Add your own variables</b>" + "*" + tooltip)
        sub_vbox.pack_start(label, expand=False, fill=False)
        sub_vbox.pack_start(setting_widget, expand=True, fill=True)

        return advanced_vbox

    def create_visual_elements(self):
        self.nb = gtk.Notebook()
        self.nb.set_show_tabs(True)
        self.nb.append_page(self.create_build_environment_page(), gtk.Label("Build environment"))
        self.nb.append_page(self.create_shared_state_page(), gtk.Label("Shared state"))
        self.nb.append_page(self.create_network_page(), gtk.Label("Network"))
        self.nb.append_page(self.create_others_page(), gtk.Label("Others"))
        self.nb.set_current_page(0)
        self.vbox.pack_start(self.nb, expand=True, fill=True)
        self.vbox.pack_end(gtk.HSeparator(), expand=True, fill=True)

        self.show_all()

    def destroy(self):
        self.handler.disconnect(self.proxy_test_passed_id)
        self.handler.disconnect(self.proxy_test_failed_id)
        super(SimpleSettingsDialog, self).destroy()
@@ -30,7 +30,6 @@ class HobColors:
    BLACK = "#000000"
    PALE_BLUE = "#53b8ff"
    DEEP_RED = "#aa3e3e"
    KHAKI = "#fff68f"

    OK = WHITE
    RUNNING = PALE_GREEN

@@ -41,9 +41,6 @@ class HobHandler(gobject.GObject):
        "command-failed" : (gobject.SIGNAL_RUN_LAST,
                            gobject.TYPE_NONE,
                            (gobject.TYPE_STRING,)),
        "parsing-warning" : (gobject.SIGNAL_RUN_LAST,
                            gobject.TYPE_NONE,
                            (gobject.TYPE_STRING,)),
        "sanity-failed" : (gobject.SIGNAL_RUN_LAST,
                           gobject.TYPE_NONE,
                           (gobject.TYPE_STRING, gobject.TYPE_INT)),
@@ -68,17 +65,10 @@ class HobHandler(gobject.GObject):
        "package-populated" : (gobject.SIGNAL_RUN_LAST,
                               gobject.TYPE_NONE,
                               ()),
        "network-passed" : (gobject.SIGNAL_RUN_LAST,
                            gobject.TYPE_NONE,
                            ()),
        "network-failed" : (gobject.SIGNAL_RUN_LAST,
                            gobject.TYPE_NONE,
                            ()),
    }

    (GENERATE_CONFIGURATION, GENERATE_RECIPES, GENERATE_PACKAGES, GENERATE_IMAGE, POPULATE_PACKAGEINFO, SANITY_CHECK, NETWORK_TEST) = range(7)
    (SUB_PATH_LAYERS, SUB_FILES_DISTRO, SUB_FILES_MACH, SUB_FILES_SDKMACH, SUB_MATCH_CLASS, SUB_PARSE_CONFIG, SUB_SANITY_CHECK,
     SUB_GNERATE_TGTS, SUB_GENERATE_PKGINFO, SUB_BUILD_RECIPES, SUB_BUILD_IMAGE, SUB_NETWORK_TEST) = range(12)
    (GENERATE_CONFIGURATION, GENERATE_RECIPES, GENERATE_PACKAGES, GENERATE_IMAGE, POPULATE_PACKAGEINFO, SANITY_CHECK) = range(6)
    (SUB_PATH_LAYERS, SUB_FILES_DISTRO, SUB_FILES_MACH, SUB_FILES_SDKMACH, SUB_MATCH_CLASS, SUB_PARSE_CONFIG, SUB_SANITY_CHECK, SUB_GNERATE_TGTS, SUB_GENERATE_PKGINFO, SUB_BUILD_RECIPES, SUB_BUILD_IMAGE) = range(11)

    def __init__(self, server, recipe_model, package_model):
        super(HobHandler, self).__init__()
@@ -98,7 +88,6 @@ class HobHandler(gobject.GObject):
        self.server = server
        self.error_msg = ""
        self.initcmd = None
        self.parsing = False

    def set_busy(self):
        if not self.generating:
@@ -112,9 +101,13 @@ class HobHandler(gobject.GObject):

    def runCommand(self, commandline):
        try:
            result, error = self.server.runCommand(commandline)
            if error:
                raise Exception("Error running command '%s': %s" % (commandline, error))
            result = self.server.runCommand(commandline)
            result_str = str(result)
            if (result_str.startswith("Busy (") or
                result_str == "No such command"):
                raise Exception('%s has failed with output "%s". ' %
                                (str(commandline), result_str) +
                                "We recommend that you restart Hob.")
            return result
        except Exception as e:
            self.commands_async = []
@@ -146,17 +139,13 @@ class HobHandler(gobject.GObject):
        elif next_command == self.SUB_MATCH_CLASS:
            self.runCommand(["findFilesMatchingInDir", "rootfs_", "classes"])
        elif next_command == self.SUB_PARSE_CONFIG:
            self.runCommand(["enableDataTracking"])
            self.runCommand(["parseConfigurationFiles", "", ""])
            self.runCommand(["disableDataTracking"])
        elif next_command == self.SUB_GNERATE_TGTS:
            self.runCommand(["generateTargetsTree", "classes/image.bbclass", []])
        elif next_command == self.SUB_GENERATE_PKGINFO:
            self.runCommand(["triggerEvent", "bb.event.RequestPackageInfo()"])
        elif next_command == self.SUB_SANITY_CHECK:
            self.runCommand(["triggerEvent", "bb.event.SanityCheck()"])
        elif next_command == self.SUB_NETWORK_TEST:
            self.runCommand(["triggerEvent", "bb.event.NetworkTest()"])
        elif next_command == self.SUB_BUILD_RECIPES:
            self.clear_busy()
            self.building = True
@@ -172,14 +161,6 @@ class HobHandler(gobject.GObject):
        if self.toolchain_packages:
            self.runCommand(["setVariable", "TOOLCHAIN_TARGET_TASK", " ".join(self.toolchain_packages)])
            targets.append(self.toolchain)
        if targets[0] == "hob-image":
            hobImage = self.runCommand(["matchFile", "hob-image.bb"])
            if self.base_image != "Create your own image":
                baseImage = self.runCommand(["matchFile", self.base_image + ".bb"])
                version = self.runCommand(["generateNewImage", hobImage, baseImage, self.package_queue])
                targets[0] += version
                self.recipe_model.set_custom_image_version(version)

        self.runCommand(["buildTargets", targets, self.default_task])

    def display_error(self):
@@ -202,10 +183,6 @@ class HobHandler(gobject.GObject):
            self.run_next_command()

        elif isinstance(event, bb.event.SanityCheckPassed):
            reparse = self.runCommand(["getVariable", "BB_INVALIDCONF"]) or None
            if reparse is True:
                self.runCommand(["setVariable", "BB_INVALIDCONF", False])
                self.runCommand(["parseConfigurationFiles", "", ""])
            self.run_next_command()

        elif isinstance(event, bb.event.SanityCheckFailed):
@@ -217,11 +194,6 @@ class HobHandler(gobject.GObject):
                formatter = bb.msg.BBLogFormatter()
                msg = formatter.format(event)
                self.error_msg += msg + '\n'
            elif event.levelno >= logging.WARNING and self.parsing == True:
                formatter = bb.msg.BBLogFormatter()
                msg = formatter.format(event)
                warn_msg = msg + '\n'
                self.emit("parsing-warning", warn_msg)

        elif isinstance(event, bb.event.TargetsTreeGenerated):
            self.current_phase = "data generation"
@@ -262,10 +234,8 @@ class HobHandler(gobject.GObject):
            message["eventname"] = bb.event.getName(event)
            message["current"] = 0
            message["total"] = None
            message["title"] = "Parsing recipes"
            message["title"] = "Parsing recipes: "
            self.emit("parsing-started", message)
            if isinstance(event, bb.event.ParseStarted):
                self.parsing = True
        elif isinstance(event, (bb.event.ParseProgress,
                                bb.event.CacheLoadProgress,
                                bb.event.TreeDataPreparationProgress)):
@@ -273,7 +243,7 @@ class HobHandler(gobject.GObject):
            message["eventname"] = bb.event.getName(event)
            message["current"] = event.current
            message["total"] = event.total
            message["title"] = "Parsing recipes"
            message["title"] = "Parsing recipes: "
            self.emit("parsing", message)
        elif isinstance(event, (bb.event.ParseCompleted,
                                bb.event.CacheLoadCompleted,
@@ -282,16 +252,8 @@ class HobHandler(gobject.GObject):
            message["eventname"] = bb.event.getName(event)
            message["current"] = event.total
            message["total"] = event.total
            message["title"] = "Parsing recipes"
            message["title"] = "Parsing recipes: "
            self.emit("parsing-completed", message)
            if isinstance(event, bb.event.ParseCompleted):
                self.parsing = False
        elif isinstance(event, bb.event.NetworkTestFailed):
            self.emit("network-failed")
            self.run_next_command()
        elif isinstance(event, bb.event.NetworkTestPassed):
            self.emit("network-passed")
            self.run_next_command()

        if self.error_msg and not self.commands_async:
            self.display_error()
@@ -307,42 +269,42 @@ class HobHandler(gobject.GObject):
        self.runCommand(["setVariable", "INHERIT", inherits])

    def set_bblayers(self, bblayers):
        self.runCommand(["setVariable", "BBLAYERS", " ".join(bblayers)])
        self.runCommand(["setVariable", "BBLAYERS_HOB", " ".join(bblayers)])

    def set_machine(self, machine):
        if machine:
            self.runCommand(["setVariable", "MACHINE", machine])
            self.runCommand(["setVariable", "MACHINE_HOB", machine])

    def set_sdk_machine(self, sdk_machine):
        self.runCommand(["setVariable", "SDKMACHINE", sdk_machine])
        self.runCommand(["setVariable", "SDKMACHINE_HOB", sdk_machine])

    def set_image_fstypes(self, image_fstypes):
        self.runCommand(["setVariable", "IMAGE_FSTYPES", image_fstypes])

    def set_distro(self, distro):
        self.runCommand(["setVariable", "DISTRO", distro])
        self.runCommand(["setVariable", "DISTRO_HOB", distro])

    def set_package_format(self, format):
        package_classes = ""
        for pkgfmt in format.split():
            package_classes += ("package_%s" % pkgfmt + " ")
        self.runCommand(["setVariable", "PACKAGE_CLASSES", package_classes])
        self.runCommand(["setVariable", "PACKAGE_CLASSES_HOB", package_classes])

    def set_bbthreads(self, threads):
        self.runCommand(["setVariable", "BB_NUMBER_THREADS", threads])
        self.runCommand(["setVariable", "BB_NUMBER_THREADS_HOB", threads])

    def set_pmake(self, threads):
        pmake = "-j %s" % threads
        self.runCommand(["setVariable", "PARALLEL_MAKE", pmake])
        self.runCommand(["setVariable", "PARALLEL_MAKE_HOB", pmake])

    def set_dl_dir(self, directory):
        self.runCommand(["setVariable", "DL_DIR", directory])
        self.runCommand(["setVariable", "DL_DIR_HOB", directory])

    def set_sstate_dir(self, directory):
        self.runCommand(["setVariable", "SSTATE_DIR", directory])
        self.runCommand(["setVariable", "SSTATE_DIR_HOB", directory])

    def set_sstate_mirrors(self, url):
        self.runCommand(["setVariable", "SSTATE_MIRRORS", url])
        self.runCommand(["setVariable", "SSTATE_MIRRORS_HOB", url])

    def set_extra_size(self, image_extra_size):
        self.runCommand(["setVariable", "IMAGE_ROOTFS_EXTRA_SPACE", str(image_extra_size)])
@@ -351,13 +313,16 @@ class HobHandler(gobject.GObject):
        self.runCommand(["setVariable", "IMAGE_ROOTFS_SIZE", str(image_rootfs_size)])

    def set_incompatible_license(self, incompat_license):
        self.runCommand(["setVariable", "INCOMPATIBLE_LICENSE", incompat_license])
        self.runCommand(["setVariable", "INCOMPATIBLE_LICENSE_HOB", incompat_license])

    def set_extra_config(self, extra_setting):
        for key in extra_setting.keys():
            value = extra_setting[key]
            self.runCommand(["setVariable", key, value])

    def set_config_filter(self, config_filter):
        self.runCommand(["setConfFilter", config_filter])

    def set_http_proxy(self, http_proxy):
        self.runCommand(["setVariable", "http_proxy", http_proxy])

@@ -367,8 +332,9 @@ class HobHandler(gobject.GObject):
    def set_ftp_proxy(self, ftp_proxy):
        self.runCommand(["setVariable", "ftp_proxy", ftp_proxy])

    def set_socks_proxy(self, socks_proxy):
        self.runCommand(["setVariable", "all_proxy", socks_proxy])
    def set_git_proxy(self, host, port):
        self.runCommand(["setVariable", "GIT_PROXY_HOST", host])
        self.runCommand(["setVariable", "GIT_PROXY_PORT", port])

    def set_cvs_proxy(self, host, port):
        self.runCommand(["setVariable", "CVS_PROXY_HOST", host])
@@ -382,10 +348,6 @@ class HobHandler(gobject.GObject):
        self.commands_async.append(self.SUB_SANITY_CHECK)
        self.run_next_command(self.SANITY_CHECK)

    def trigger_network_test(self):
        self.commands_async.append(self.SUB_NETWORK_TEST)
        self.run_next_command(self.NETWORK_TEST)

    def generate_configuration(self):
        self.commands_async.append(self.SUB_PARSE_CONFIG)
        self.commands_async.append(self.SUB_PATH_LAYERS)
@@ -409,9 +371,8 @@ class HobHandler(gobject.GObject):
        self.commands_async.append(self.SUB_BUILD_RECIPES)
        self.run_next_command(self.GENERATE_PACKAGES)

    def generate_image(self, image, base_image, toolchain, image_packages=[], toolchain_packages=[], default_task="build"):
    def generate_image(self, image, toolchain, image_packages=[], toolchain_packages=[], default_task="build"):
        self.image = image
        self.base_image = base_image
        self.toolchain = toolchain
        self.package_queue = image_packages
        self.toolchain_packages = toolchain_packages
@@ -444,7 +405,7 @@ class HobHandler(gobject.GObject):
        self.build.reset()

    def get_logfile(self):
        return self.server.runCommand(["getVariable", "BB_CONSOLELOG"])[0]
        return self.server.runCommand(["getVariable", "BB_CONSOLELOG"])

    def _remove_redundant(self, string):
        ret = []
@@ -453,20 +414,14 @@ class HobHandler(gobject.GObject):
                ret.append(i)
        return " ".join(ret)

    def set_var_in_file(self, var, val, default_file=None):
        self.server.runCommand(["setVarFile", var, val, default_file])

    def get_parameters(self):
        # retrieve the parameters from bitbake
        params = {}
        params["core_base"] = self.runCommand(["getVariable", "COREBASE"]) or ""
        hob_layer = params["core_base"] + "/meta-hob"
        params["layer"] = self.runCommand(["getVariable", "BBLAYERS"]) or ""
        params["layers_non_removable"] = self.runCommand(["getVariable", "BBLAYERS_NON_REMOVABLE"]) or ""
        if hob_layer not in params["layer"].split():
            params["layer"] += (" " + hob_layer)
        if hob_layer not in params["layers_non_removable"].split():
            params["layers_non_removable"] += (" " + hob_layer)
        params["dldir"] = self.runCommand(["getVariable", "DL_DIR"]) or ""
        params["machine"] = self.runCommand(["getVariable", "MACHINE"]) or ""
        params["distro"] = self.runCommand(["getVariable", "DISTRO"]) or "defaultsetup"
@@ -564,7 +519,9 @@ class HobHandler(gobject.GObject):

        params["default_task"] = self.runCommand(["getVariable", "BB_DEFAULT_TASK"]) or "build"

        params["socks_proxy"] = self.runCommand(["getVariable", "all_proxy"]) or ""
        params["git_proxy_host"] = self.runCommand(["getVariable", "GIT_PROXY_HOST"]) or ""
        params["git_proxy_port"] = self.runCommand(["getVariable", "GIT_PROXY_PORT"]) or ""

        params["http_proxy"] = self.runCommand(["getVariable", "http_proxy"]) or ""
        params["ftp_proxy"] = self.runCommand(["getVariable", "ftp_proxy"]) or ""
        params["https_proxy"] = self.runCommand(["getVariable", "https_proxy"]) or ""

@@ -27,15 +27,14 @@ from bb.ui.crumbs.hobpages import HobPage
#
# PackageListModel
#
class PackageListModel(gtk.ListStore):
class PackageListModel(gtk.TreeStore):
    """
    This class defines an gtk.ListStore subclass which will convert the output
    of the bb.event.TargetsTreeGenerated event into a gtk.ListStore whilst also
    This class defines an gtk.TreeStore subclass which will convert the output
    of the bb.event.TargetsTreeGenerated event into a gtk.TreeStore whilst also
    providing convenience functions to access gtk.TreeModel subclasses which
    provide filtered views of the data.
    """

    (COL_NAME, COL_VER, COL_REV, COL_RNM, COL_SEC, COL_SUM, COL_RDEP, COL_RPROV, COL_SIZE, COL_RCP, COL_BINB, COL_INC, COL_FADE_INC, COL_FONT, COL_FLIST) = range(15)
    (COL_NAME, COL_VER, COL_REV, COL_RNM, COL_SEC, COL_SUM, COL_RDEP, COL_RPROV, COL_SIZE, COL_BINB, COL_INC, COL_FADE_INC, COL_FONT) = range(13)

__gsignals__ = {
|
||||
"package-selection-changed" : (gobject.SIGNAL_RUN_LAST,
|
||||
@@ -46,9 +45,15 @@ class PackageListModel(gtk.ListStore):
|
||||
__toolchain_required_packages__ = ["packagegroup-core-standalone-sdk-target", "packagegroup-core-standalone-sdk-target-dbg"]
|
||||
|
||||
def __init__(self):
|
||||
|
||||
self.contents = None
|
||||
self.images = None
|
||||
self.pkgs_size = 0
|
||||
self.pn_path = {}
|
||||
self.pkg_path = {}
|
||||
self.rprov_pkg = {}
|
||||
gtk.ListStore.__init__ (self,
|
||||
gobject.TYPE_STRING,
|
||||
|
||||
gtk.TreeStore.__init__ (self,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING,
|
||||
@@ -61,23 +66,23 @@ class PackageListModel(gtk.ListStore):
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_BOOLEAN,
|
||||
gobject.TYPE_BOOLEAN,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING)
|
||||
|
||||
|
||||
"""
|
||||
Find the model path for the item_name
|
||||
Returns the path in the model or None
|
||||
"""
|
||||
def find_path_for_item(self, item_name):
|
||||
pkg = item_name
|
||||
if item_name not in self.pn_path.keys():
|
||||
if item_name not in self.pkg_path.keys():
|
||||
if item_name not in self.rprov_pkg.keys():
|
||||
return None
|
||||
pkg = self.rprov_pkg[item_name]
|
||||
if pkg not in self.pn_path.keys():
|
||||
if pkg not in self.pkg_path.keys():
|
||||
return None
|
||||
|
||||
return self.pn_path[pkg]
|
||||
return self.pkg_path[pkg]
|
||||
|
||||
def find_item_for_path(self, item_path):
|
||||
return self[item_path][self.COL_NAME]
|
||||
@@ -86,110 +91,25 @@ class PackageListModel(gtk.ListStore):
|
||||
Helper function to determine whether an item is an item specified by filter
|
||||
"""
|
||||
def tree_model_filter(self, model, it, filter):
|
||||
name = model.get_value(it, self.COL_NAME)
|
||||
|
||||
for key in filter.keys():
|
||||
if key == self.COL_NAME:
|
||||
if filter[key] != 'Search packages by name':
|
||||
if name and filter[key] not in name:
|
||||
return False
|
||||
else:
|
||||
if model.get_value(it, key) not in filter[key]:
|
||||
return False
|
||||
self.filtered_nb += 1
|
||||
if model.get_value(it, key) not in filter[key]:
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
"""
|
||||
Create, if required, and return a filtered gtk.TreeModelSort
|
||||
containing only the items specified by filter
|
||||
"""
|
||||
def tree_model(self, filter, excluded_items_ahead=False, included_items_ahead=False, search_data=None, initial=False):
|
||||
def tree_model(self, filter):
|
||||
model = self.filter_new()
|
||||
self.filtered_nb = 0
|
||||
model.set_visible_func(self.tree_model_filter, filter)
|
||||
|
||||
sort = gtk.TreeModelSort(model)
|
||||
if initial:
|
||||
sort.set_sort_column_id(PackageListModel.COL_NAME, gtk.SORT_ASCENDING)
|
||||
sort.set_default_sort_func(None)
|
||||
|
||||
if excluded_items_ahead:
|
||||
sort.set_default_sort_func(self.exclude_item_sort_func, search_data)
|
||||
elif included_items_ahead:
|
||||
sort.set_default_sort_func(self.include_item_sort_func, search_data)
|
||||
else:
|
||||
if search_data and search_data!='Search recipes by name' and search_data!='Search package groups by name':
|
||||
sort.set_default_sort_func(self.sort_func, search_data)
|
||||
else:
|
||||
sort.set_sort_column_id(PackageListModel.COL_NAME, gtk.SORT_ASCENDING)
|
||||
sort.set_default_sort_func(None)
|
||||
|
||||
sort.set_sort_func(PackageListModel.COL_INC, self.sort_column, PackageListModel.COL_INC)
|
||||
sort.set_sort_func(PackageListModel.COL_SIZE, self.sort_column, PackageListModel.COL_SIZE)
|
||||
sort.set_sort_func(PackageListModel.COL_BINB, self.sort_column, PackageListModel.COL_BINB)
|
||||
sort.set_sort_func(PackageListModel.COL_RCP, self.sort_column, PackageListModel.COL_RCP)
|
||||
sort.set_sort_column_id(PackageListModel.COL_NAME, gtk.SORT_ASCENDING)
|
||||
sort.set_default_sort_func(None)
|
||||
return sort
|
||||
|
||||
def sort_column(self, model, row1, row2, col):
|
||||
value1 = model.get_value(row1, col)
|
||||
value2 = model.get_value(row2, col)
|
||||
if col==PackageListModel.COL_SIZE:
|
||||
value1 = HobPage._string_to_size(value1)
|
||||
value2 = HobPage._string_to_size(value2)
|
||||
|
||||
cmp_res = cmp(value1, value2)
|
||||
if cmp_res!=0:
|
||||
if col==PackageListModel.COL_INC:
|
||||
return -cmp_res
|
||||
else:
|
||||
return cmp_res
|
||||
else:
|
||||
name1 = model.get_value(row1, PackageListModel.COL_NAME)
|
||||
name2 = model.get_value(row2, PackageListModel.COL_NAME)
|
||||
return cmp(name1,name2)
|
||||
|
||||
def exclude_item_sort_func(self, model, iter1, iter2, user_data=None):
|
||||
if user_data:
|
||||
val1 = model.get_value(iter1, PackageListModel.COL_NAME)
|
||||
val2 = model.get_value(iter2, PackageListModel.COL_NAME)
|
||||
if val1.startswith(user_data) and not val2.startswith(user_data):
|
||||
return -1
|
||||
elif not val1.startswith(user_data) and val2.startswith(user_data):
|
||||
return 1
|
||||
else:
|
||||
return 0
|
||||
else:
|
||||
val1 = model.get_value(iter1, PackageListModel.COL_FADE_INC)
|
||||
val2 = model.get_value(iter2, PackageListModel.COL_INC)
|
||||
return ((val1 == True) and (val2 == False))
|
||||
|
||||
def include_item_sort_func(self, model, iter1, iter2, user_data=None):
|
||||
if user_data:
|
||||
val1 = model.get_value(iter1, PackageListModel.COL_NAME)
|
||||
val2 = model.get_value(iter2, PackageListModel.COL_NAME)
|
||||
if val1.startswith(user_data) and not val2.startswith(user_data):
|
||||
return -1
|
||||
elif not val1.startswith(user_data) and val2.startswith(user_data):
|
||||
return 1
|
||||
else:
|
||||
return 0
|
||||
else:
|
||||
val1 = model.get_value(iter1, PackageListModel.COL_INC)
|
||||
val2 = model.get_value(iter2, PackageListModel.COL_INC)
|
||||
return ((val1 == False) and (val2 == True))
|
||||
|
||||
def sort_func(self, model, iter1, iter2, user_data):
|
||||
val1 = model.get_value(iter1, PackageListModel.COL_NAME)
|
||||
val2 = model.get_value(iter2, PackageListModel.COL_NAME)
|
||||
if val1 is None or val2 is None:
|
||||
return 0
|
||||
elif val1.startswith(user_data) and not val2.startswith(user_data):
|
||||
return -1
|
||||
elif not val1.startswith(user_data) and val2.startswith(user_data):
|
||||
return 1
|
||||
else:
|
||||
return 0
|
||||
|
||||
def convert_vpath_to_path(self, view_model, view_path):
|
||||
# view_model is the model sorted
|
||||
# get the path of the model filtered
|
||||
@@ -201,13 +121,16 @@ class PackageListModel(gtk.ListStore):
|
||||
return path
|
||||
|
||||
def convert_path_to_vpath(self, view_model, path):
|
||||
name = self.find_item_for_path(path)
|
||||
it = view_model.get_iter_first()
|
||||
while it:
|
||||
name = self.find_item_for_path(path)
|
||||
view_name = view_model.get_value(it, PackageListModel.COL_NAME)
|
||||
if view_name == name:
|
||||
view_path = view_model.get_path(it)
|
||||
return view_path
|
||||
child_it = view_model.iter_children(it)
|
||||
while child_it:
|
||||
view_name = view_model.get_value(child_it, self.COL_NAME)
|
||||
if view_name == name:
|
||||
view_path = view_model.get_path(child_it)
|
||||
return view_path
|
||||
child_it = view_model.iter_next(child_it)
|
||||
it = view_model.iter_next(it)
|
||||
return None
|
||||
|
||||
@@ -216,8 +139,11 @@ class PackageListModel(gtk.ListStore):
|
||||
bb.event.PackageInfo event and populates the package list.
|
||||
"""
|
||||
def populate(self, pkginfolist):
|
||||
# First clear the model, in case repopulating
|
||||
self.clear()
|
||||
self.pkgs_size = 0
|
||||
self.pn_path = {}
|
||||
self.pkg_path = {}
|
||||
self.rprov_pkg = {}
|
||||
|
||||
def getpkgvalue(pkgdict, key, pkgname, defaultval = None):
|
||||
value = pkgdict.get('%s_%s' % (key, pkgname), None)
|
||||
@@ -229,6 +155,15 @@ class PackageListModel(gtk.ListStore):
|
||||
pn = pkginfo['PN']
|
||||
pv = pkginfo['PV']
|
||||
pr = pkginfo['PR']
|
||||
if pn in self.pn_path.keys():
|
||||
pniter = self.get_iter(self.pn_path[pn])
|
||||
else:
|
||||
pniter = self.append(None)
|
||||
self.set(pniter, self.COL_NAME, pn + '-' + pv + '-' + pr,
|
||||
self.COL_INC, False)
|
||||
self.pn_path[pn] = self.get_path(pniter)
|
||||
|
||||
# PKG is always present
|
||||
pkg = pkginfo['PKG']
|
||||
pkgv = getpkgvalue(pkginfo, 'PKGV', pkg)
|
||||
pkgr = getpkgvalue(pkginfo, 'PKGR', pkg)
|
||||
@@ -242,12 +177,9 @@ class PackageListModel(gtk.ListStore):
|
||||
rdep = getpkgvalue(pkginfo, 'RDEPENDS', pkg, "")
|
||||
rrec = getpkgvalue(pkginfo, 'RRECOMMENDS', pkg, "")
|
||||
rprov = getpkgvalue(pkginfo, 'RPROVIDES', pkg, "")
|
||||
files_list = getpkgvalue(pkginfo, 'FILES_INFO', pkg, "")
|
||||
for i in rprov.split():
|
||||
self.rprov_pkg[i] = pkg
|
||||
|
||||
recipe = pn + '-' + pv + '-' + pr
|
||||
|
||||
allow_empty = getpkgvalue(pkginfo, 'ALLOW_EMPTY', pkg, "")
|
||||
|
||||
if pkgsize == "0" and not allow_empty:
|
||||
@@ -255,27 +187,15 @@ class PackageListModel(gtk.ListStore):
|
||||
|
||||
# pkgsize is in KB
|
||||
size = HobPage._size_to_string(HobPage._string_to_size(pkgsize + ' KB'))
|
||||
self.set(self.append(), self.COL_NAME, pkg, self.COL_VER, pkgv,
|
||||
|
||||
it = self.append(pniter)
|
||||
self.pkg_path[pkg] = self.get_path(it)
|
||||
self.set(it, self.COL_NAME, pkg, self.COL_VER, pkgv,
|
||||
self.COL_REV, pkgr, self.COL_RNM, pkg_rename,
|
||||
self.COL_SEC, section, self.COL_SUM, summary,
|
||||
self.COL_RDEP, rdep + ' ' + rrec,
|
||||
self.COL_RPROV, rprov, self.COL_SIZE, size,
|
||||
self.COL_RCP, recipe, self.COL_BINB, "",
|
||||
self.COL_INC, False, self.COL_FONT, '10', self.COL_FLIST, files_list)
|
||||
|
||||
self.pn_path = {}
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
pn = self.get_value(it, self.COL_NAME)
|
||||
path = self.get_path(it)
|
||||
self.pn_path[pn] = path
|
||||
it = self.iter_next(it)
|
||||
|
||||
"""
|
||||
Update the model, send out the notification.
|
||||
"""
|
||||
def selection_change_notification(self):
|
||||
self.emit("package-selection-changed")
|
||||
self.COL_BINB, "", self.COL_INC, False, self.COL_FONT, '10')
|
||||
|
||||
"""
|
||||
Check whether the item at item_path is included or not
|
||||
@@ -284,26 +204,54 @@ class PackageListModel(gtk.ListStore):
|
||||
return self[item_path][self.COL_INC]
|
||||
|
||||
"""
|
||||
Add this item, and any of its dependencies, to the image contents
|
||||
Update the model, send out the notification.
|
||||
"""
|
||||
def selection_change_notification(self):
|
||||
self.emit("package-selection-changed")
|
||||
|
||||
"""
|
||||
Mark a certain package as selected.
|
||||
All its dependencies are marked as selected.
|
||||
The recipe provides the package is marked as selected.
|
||||
If user explicitly selects a recipe, all its providing packages are selected
|
||||
"""
|
||||
def include_item(self, item_path, binb=""):
|
||||
if self.path_included(item_path):
|
||||
return
|
||||
|
||||
item_name = self[item_path][self.COL_NAME]
|
||||
item_deps = self[item_path][self.COL_RDEP]
|
||||
item_rdep = self[item_path][self.COL_RDEP]
|
||||
|
||||
self[item_path][self.COL_INC] = True
|
||||
|
||||
it = self.get_iter(item_path)
|
||||
|
||||
# If user explicitly selects a recipe, all its providing packages are selected.
|
||||
if not self[item_path][self.COL_VER] and binb == "User Selected":
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
child_path = self.get_path(child_it)
|
||||
child_included = self.path_included(child_path)
|
||||
if not child_included:
|
||||
self.include_item(child_path, binb="User Selected")
|
||||
child_it = self.iter_next(child_it)
|
||||
return
|
||||
|
||||
# The recipe provides the package is also marked as selected
|
||||
parent_it = self.iter_parent(it)
|
||||
if parent_it:
|
||||
parent_path = self.get_path(parent_it)
|
||||
self[parent_path][self.COL_INC] = True
|
||||
|
||||
item_bin = self[item_path][self.COL_BINB].split(', ')
|
||||
if binb and not binb in item_bin:
|
||||
item_bin.append(binb)
|
||||
self[item_path][self.COL_BINB] = ', '.join(item_bin).lstrip(', ')
|
||||
|
||||
if item_deps:
|
||||
if item_rdep:
|
||||
# Ensure all of the items deps are included and, where appropriate,
|
||||
# add this item to their COL_BINB
|
||||
for dep in item_deps.split(" "):
|
||||
for dep in item_rdep.split(" "):
|
||||
if dep.startswith('('):
|
||||
continue
|
||||
# If the contents model doesn't already contain dep, add it
|
||||
@@ -322,6 +270,12 @@ class PackageListModel(gtk.ListStore):
|
||||
elif not dep_included:
|
||||
self.include_item(dep_path, binb=item_name)
|
||||
|
||||
"""
|
||||
Mark a certain package as de-selected.
|
||||
All other packages that depends on this package are marked as de-selected.
|
||||
If none of the packages provided by the recipe, the recipe should be marked as de-selected.
|
||||
If user explicitly de-select a recipe, all its providing packages are de-selected.
|
||||
"""
|
||||
def exclude_item(self, item_path):
|
||||
if not self.path_included(item_path):
|
||||
return
|
||||
@@ -329,9 +283,37 @@ class PackageListModel(gtk.ListStore):
|
||||
self[item_path][self.COL_INC] = False
|
||||
|
||||
item_name = self[item_path][self.COL_NAME]
|
||||
item_deps = self[item_path][self.COL_RDEP]
|
||||
if item_deps:
|
||||
for dep in item_deps.split(" "):
|
||||
item_rdep = self[item_path][self.COL_RDEP]
|
||||
it = self.get_iter(item_path)
|
||||
|
||||
# If user explicitly de-select a recipe, all its providing packages are de-selected.
|
||||
if not self[item_path][self.COL_VER]:
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
child_path = self.get_path(child_it)
|
||||
child_included = self[child_path][self.COL_INC]
|
||||
if child_included:
|
||||
self.exclude_item(child_path)
|
||||
child_it = self.iter_next(child_it)
|
||||
return
|
||||
|
||||
# If none of the packages provided by the recipe, the recipe should be marked as de-selected.
|
||||
parent_it = self.iter_parent(it)
|
||||
peer_iter = self.iter_children(parent_it)
|
||||
enabled = 0
|
||||
while peer_iter:
|
||||
peer_path = self.get_path(peer_iter)
|
||||
if self[peer_path][self.COL_INC]:
|
||||
enabled = 1
|
||||
break
|
||||
peer_iter = self.iter_next(peer_iter)
|
||||
if not enabled:
|
||||
parent_path = self.get_path(parent_it)
|
||||
self[parent_path][self.COL_INC] = False
|
||||
|
||||
# All packages that depends on this package are de-selected.
|
||||
if item_rdep:
|
||||
for dep in item_rdep.split(" "):
|
||||
if dep.startswith('('):
|
||||
continue
|
||||
dep_path = self.find_path_for_item(dep)
|
||||
@@ -351,40 +333,51 @@ class PackageListModel(gtk.ListStore):
|
||||
self.exclude_item(binb_path)
|
||||
|
||||
"""
|
||||
Empty self.contents by setting the include of each entry to None
|
||||
Package model may be incomplete, therefore when calling the
|
||||
set_selected_packages(), some packages will not be set included.
|
||||
Return the un-set packages list.
|
||||
"""
|
||||
def reset(self):
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
self.set(it,
|
||||
self.COL_INC, False,
|
||||
self.COL_BINB, "")
|
||||
it = self.iter_next(it)
|
||||
def set_selected_packages(self, packagelist, user_selected=False):
|
||||
left = []
|
||||
binb = 'User Selected' if user_selected else ''
|
||||
for pn in packagelist:
|
||||
if pn in self.pkg_path.keys():
|
||||
path = self.pkg_path[pn]
|
||||
self.include_item(item_path=path, binb=binb)
|
||||
else:
|
||||
left.append(pn)
|
||||
|
||||
self.selection_change_notification()
|
||||
|
||||
def get_selected_packages(self):
|
||||
packagelist = []
|
||||
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
if self.get_value(it, self.COL_INC):
|
||||
name = self.get_value(it, self.COL_NAME)
|
||||
packagelist.append(name)
|
||||
it = self.iter_next(it)
|
||||
|
||||
return packagelist
|
||||
return left
|
||||
|
||||
def get_user_selected_packages(self):
|
||||
packagelist = []
|
||||
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
if self.get_value(it, self.COL_INC):
|
||||
binb = self.get_value(it, self.COL_BINB)
|
||||
if binb == "User Selected":
|
||||
name = self.get_value(it, self.COL_NAME)
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
if self.get_value(child_it, self.COL_INC):
|
||||
binb = self.get_value(child_it, self.COL_BINB)
|
||||
if binb == "User Selected":
|
||||
name = self.get_value(child_it, self.COL_NAME)
|
||||
packagelist.append(name)
|
||||
child_it = self.iter_next(child_it)
|
||||
it = self.iter_next(it)
|
||||
|
||||
return packagelist
|
||||
|
||||
def get_selected_packages(self):
|
||||
packagelist = []
|
||||
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
if self.get_value(child_it, self.COL_INC):
|
||||
name = self.get_value(child_it, self.COL_NAME)
|
||||
packagelist.append(name)
|
||||
child_it = self.iter_next(child_it)
|
||||
it = self.iter_next(it)
|
||||
|
||||
return packagelist
|
||||
@@ -395,31 +388,16 @@ class PackageListModel(gtk.ListStore):
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
if self.get_value(it, self.COL_INC):
|
||||
name = self.get_value(it, self.COL_NAME)
|
||||
if name.endswith("-dev") or name.endswith("-dbg"):
|
||||
packagelist.append(name)
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
name = self.get_value(child_it, self.COL_NAME)
|
||||
inc = self.get_value(child_it, self.COL_INC)
|
||||
if inc or name.endswith("-dev") or name.endswith("-dbg"):
|
||||
packagelist.append(name)
|
||||
child_it = self.iter_next(child_it)
|
||||
it = self.iter_next(it)
|
||||
|
||||
return list(set(packagelist + self.__toolchain_required_packages__));
|
||||
|
||||
"""
|
||||
Package model may be incomplete, therefore when calling the
|
||||
set_selected_packages(), some packages will not be set included.
|
||||
Return the un-set packages list.
|
||||
"""
|
||||
def set_selected_packages(self, packagelist, user_selected=False):
|
||||
left = []
|
||||
binb = 'User Selected' if user_selected else ''
|
||||
for pn in packagelist:
|
||||
if pn in self.pn_path.keys():
|
||||
path = self.pn_path[pn]
|
||||
self.include_item(item_path=path, binb=binb)
|
||||
else:
|
||||
left.append(pn)
|
||||
|
||||
self.selection_change_notification()
|
||||
return left
|
||||
|
||||
"""
|
||||
Return the selected package size, unit is B.
|
||||
"""
|
||||
@@ -427,16 +405,37 @@ class PackageListModel(gtk.ListStore):
|
||||
packages_size = 0
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
if self.get_value(it, self.COL_INC):
|
||||
str_size = self.get_value(it, self.COL_SIZE)
|
||||
if not str_size:
|
||||
continue
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
if self.get_value(child_it, self.COL_INC):
|
||||
str_size = self.get_value(child_it, self.COL_SIZE)
|
||||
if not str_size:
|
||||
continue
|
||||
|
||||
packages_size += HobPage._string_to_size(str_size)
|
||||
packages_size += HobPage._string_to_size(str_size)
|
||||
|
||||
child_it = self.iter_next(child_it)
|
||||
it = self.iter_next(it)
|
||||
return packages_size
|
||||
|
||||
"""
|
||||
Empty self.contents by setting the include of each entry to None
|
||||
"""
|
||||
def reset(self):
|
||||
self.pkgs_size = 0
|
||||
it = self.get_iter_first()
|
||||
while it:
|
||||
self.set(it, self.COL_INC, False)
|
||||
child_it = self.iter_children(it)
|
||||
while child_it:
|
||||
self.set(child_it,
|
||||
self.COL_INC, False,
|
||||
self.COL_BINB, "")
|
||||
child_it = self.iter_next(child_it)
|
||||
it = self.iter_next(it)
|
||||
|
||||
self.selection_change_notification()
|
||||
|
||||
"""
|
||||
Resync the state of included items to a backup column before performing the fadeout visible effect
|
||||
"""
|
||||
@@ -445,6 +444,9 @@ class PackageListModel(gtk.ListStore):
|
||||
while it:
|
||||
active = self.get_value(it, self.COL_INC)
|
||||
self.set(it, self.COL_FADE_INC, active)
|
||||
if self.iter_has_child(it):
|
||||
self.resync_fadeout_column(self.iter_children(it))
|
||||
|
||||
it = self.iter_next(it)
|
||||
|
||||
#
|
||||
@@ -457,8 +459,7 @@ class RecipeListModel(gtk.ListStore):
|
||||
providing convenience functions to access gtk.TreeModel subclasses which
|
||||
provide filtered views of the data.
|
||||
"""
|
||||
(COL_NAME, COL_DESC, COL_LIC, COL_GROUP, COL_DEPS, COL_BINB, COL_TYPE, COL_INC, COL_IMG, COL_INSTALL, COL_PN, COL_FADE_INC, COL_SUMMARY, COL_VERSION,
|
||||
COL_REVISION, COL_HOMEPAGE, COL_BUGTRACKER) = range(17)
|
||||
(COL_NAME, COL_DESC, COL_LIC, COL_GROUP, COL_DEPS, COL_BINB, COL_TYPE, COL_INC, COL_IMG, COL_INSTALL, COL_PN, COL_FADE_INC) = range(12)
|
||||
|
||||
__custom_image__ = "Create your own image"
|
||||
|
||||
@@ -483,12 +484,7 @@ class RecipeListModel(gtk.ListStore):
|
||||
gobject.TYPE_BOOLEAN,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_BOOLEAN,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING,
|
||||
gobject.TYPE_STRING)
|
||||
gobject.TYPE_BOOLEAN)
|
||||
|
||||
"""
|
||||
Find the model path for the item_name
|
||||
@@ -520,104 +516,39 @@ class RecipeListModel(gtk.ListStore):
|
||||
return False
|
||||
|
||||
for key in filter.keys():
|
||||
if key == self.COL_NAME:
|
||||
if filter[key] != 'Search recipes by name' and filter[key] != 'Search package groups by name':
|
||||
if filter[key] not in name:
|
||||
return False
|
||||
else:
|
||||
if model.get_value(it, key) not in filter[key]:
|
||||
return False
|
||||
self.filtered_nb += 1
|
||||
if model.get_value(it, key) not in filter[key]:
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def exclude_item_sort_func(self, model, iter1, iter2, user_data=None):
|
||||
if user_data:
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_NAME)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_NAME)
|
||||
if val1.startswith(user_data) and not val2.startswith(user_data):
|
||||
return -1
|
||||
elif not val1.startswith(user_data) and val2.startswith(user_data):
|
||||
return 1
|
||||
else:
|
||||
return 0
|
||||
else:
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_FADE_INC)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_INC)
|
||||
return ((val1 == True) and (val2 == False))
|
||||
def exclude_item_sort_func(self, model, iter1, iter2):
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_FADE_INC)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_INC)
|
||||
return ((val1 == True) and (val2 == False))
|
||||
|
||||
def include_item_sort_func(self, model, iter1, iter2, user_data=None):
|
||||
if user_data:
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_NAME)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_NAME)
|
||||
if val1.startswith(user_data) and not val2.startswith(user_data):
|
||||
return -1
|
||||
elif not val1.startswith(user_data) and val2.startswith(user_data):
|
||||
return 1
|
||||
else:
|
||||
return 0
|
||||
else:
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_INC)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_INC)
|
||||
return ((val1 == False) and (val2 == True))
|
||||
|
||||
def sort_func(self, model, iter1, iter2, user_data):
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_NAME)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_NAME)
|
||||
if val1 is None or val2 is None:
|
||||
return 0
|
||||
elif val1.startswith(user_data) and not val2.startswith(user_data):
|
||||
return -1
|
||||
elif not val1.startswith(user_data) and val2.startswith(user_data):
|
||||
return 1
|
||||
else:
|
||||
return 0
|
||||
def include_item_sort_func(self, model, iter1, iter2):
|
||||
val1 = model.get_value(iter1, RecipeListModel.COL_INC)
|
||||
val2 = model.get_value(iter2, RecipeListModel.COL_INC)
|
||||
return ((val1 == False) and (val2 == True))
|
||||
|
||||
"""
|
||||
Create, if required, and return a filtered gtk.TreeModelSort
|
||||
containing only the items specified by filter
|
||||
containing only the items which are items specified by filter
|
||||
"""
|
||||
def tree_model(self, filter, excluded_items_ahead=False, included_items_ahead=False, search_data=None, initial=False):
|
||||
def tree_model(self, filter, excluded_items_ahead=False, included_items_ahead=True):
|
||||
model = self.filter_new()
|
||||
self.filtered_nb = 0
|
||||
model.set_visible_func(self.tree_model_filter, filter)
|
||||
|
||||
sort = gtk.TreeModelSort(model)
|
||||
if initial:
|
||||
if excluded_items_ahead:
|
||||
sort.set_default_sort_func(self.exclude_item_sort_func)
|
||||
elif included_items_ahead:
|
||||
sort.set_default_sort_func(self.include_item_sort_func)
|
||||
else:
|
||||
sort.set_sort_column_id(RecipeListModel.COL_NAME, gtk.SORT_ASCENDING)
|
||||
sort.set_default_sort_func(None)
|
||||
|
||||
if excluded_items_ahead:
|
||||
sort.set_default_sort_func(self.exclude_item_sort_func, search_data)
|
||||
elif included_items_ahead:
|
||||
sort.set_default_sort_func(self.include_item_sort_func, search_data)
|
||||
else:
|
||||
if search_data and search_data!='Search recipes by name' and search_data!='Search package groups by name':
|
||||
sort.set_default_sort_func(self.sort_func, search_data)
|
||||
else:
|
||||
sort.set_sort_column_id(RecipeListModel.COL_NAME, gtk.SORT_ASCENDING)
|
||||
sort.set_default_sort_func(None)
|
||||
|
||||
sort.set_sort_func(RecipeListModel.COL_INC, self.sort_column, RecipeListModel.COL_INC)
|
||||
sort.set_sort_func(RecipeListModel.COL_GROUP, self.sort_column, RecipeListModel.COL_GROUP)
|
||||
sort.set_sort_func(RecipeListModel.COL_BINB, self.sort_column, RecipeListModel.COL_BINB)
|
||||
sort.set_sort_func(RecipeListModel.COL_LIC, self.sort_column, RecipeListModel.COL_LIC)
|
||||
return sort
|
||||
|
||||
def sort_column(self, model, row1, row2, col):
|
||||
value1 = model.get_value(row1, col)
|
||||
value2 = model.get_value(row2, col)
|
||||
cmp_res = cmp(value1, value2)
|
||||
if cmp_res!=0:
|
||||
if col==RecipeListModel.COL_INC:
|
||||
return -cmp_res
|
||||
else:
|
||||
return cmp_res
|
||||
else:
|
||||
name1 = model.get_value(row1, RecipeListModel.COL_NAME)
|
||||
name2 = model.get_value(row2, RecipeListModel.COL_NAME)
|
||||
return cmp(name1,name2)
|
||||
|
||||
def convert_vpath_to_path(self, view_model, view_path):
|
||||
filtered_model_path = view_model.convert_path_to_child_path(view_path)
|
||||
filtered_model = view_model.get_model()
|
||||
@@ -652,9 +583,7 @@ class RecipeListModel(gtk.ListStore):
|
||||
self.COL_LIC, "", self.COL_GROUP, "",
|
||||
self.COL_DEPS, "", self.COL_BINB, "",
|
||||
self.COL_TYPE, "image", self.COL_INC, False,
|
||||
self.COL_IMG, False, self.COL_INSTALL, "", self.COL_PN, self.__custom_image__,
|
||||
self.COL_SUMMARY, "", self.COL_VERSION, "", self.COL_REVISION, "",
|
||||
self.COL_HOMEPAGE, "", self.COL_BUGTRACKER, "")
|
||||
self.COL_IMG, False, self.COL_INSTALL, "", self.COL_PN, self.__custom_image__)
|
||||
|
||||
for item in event_model["pn"]:
|
||||
name = item
|
||||
@@ -662,11 +591,6 @@ class RecipeListModel(gtk.ListStore):
|
||||
lic = event_model["pn"][item]["license"]
|
||||
group = event_model["pn"][item]["section"]
|
||||
inherits = event_model["pn"][item]["inherits"]
|
||||
summary = event_model["pn"][item]["summary"]
|
||||
version = event_model["pn"][item]["version"]
|
||||
revision = event_model["pn"][item]["revision"]
|
||||
homepage = event_model["pn"][item]["homepage"]
|
||||
bugtracker = event_model["pn"][item]["bugtracker"]
|
||||
install = []
|
||||
|
||||
depends = event_model["depends"].get(item, []) + event_model["rdepends-pn"].get(item, [])
|
||||
@@ -688,9 +612,7 @@ class RecipeListModel(gtk.ListStore):
|
||||
self.COL_LIC, lic, self.COL_GROUP, group,
|
||||
self.COL_DEPS, " ".join(depends), self.COL_BINB, "",
|
||||
self.COL_TYPE, atype, self.COL_INC, False,
|
||||
self.COL_IMG, False, self.COL_INSTALL, " ".join(install), self.COL_PN, item,
|
||||
self.COL_SUMMARY, summary, self.COL_VERSION, version, self.COL_REVISION, revision,
|
||||
self.COL_HOMEPAGE, homepage, self.COL_BUGTRACKER, bugtracker)
|
||||
self.COL_IMG, False, self.COL_INSTALL, " ".join(install), self.COL_PN, item)
|
||||
|
||||
self.pn_path = {}
|
||||
it = self.get_iter_first()
|
||||
@@ -844,9 +766,3 @@ class RecipeListModel(gtk.ListStore):
|
||||
binb="User Selected",
|
||||
image_contents=True)
|
||||
self.selection_change_notification()
|
||||
|
||||
def set_custom_image_version(self, version):
|
||||
self.custom_image_version = version
|
||||
|
||||
def get_custom_image_version(self):
|
||||
return self.custom_image_version
|
||||
|
||||
@@ -44,6 +44,8 @@ class hic:
|
||||
ICON_PACKAGES_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('packages/packages_hover.png'))
|
||||
ICON_LAYERS_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('layers/layers_display.png'))
|
||||
ICON_LAYERS_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('layers/layers_hover.png'))
|
||||
ICON_TEMPLATES_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('templates/templates_display.png'))
|
||||
ICON_TEMPLATES_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('templates/templates_hover.png'))
|
||||
ICON_IMAGES_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('images/images_display.png'))
|
||||
ICON_IMAGES_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('images/images_hover.png'))
|
||||
ICON_SETTINGS_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('settings/settings_display.png'))
|
||||
@@ -83,29 +85,23 @@ class HobViewTable (gtk.VBox):
|
||||
gobject.TYPE_PYOBJECT,)),
|
||||
}
|
||||
|
||||
def __init__(self, columns, name):
|
||||
def __init__(self, columns):
|
||||
gtk.VBox.__init__(self, False, 6)
|
||||
self.table_tree = gtk.TreeView()
|
||||
self.table_tree.set_headers_visible(True)
|
||||
self.table_tree.set_headers_clickable(True)
|
||||
self.table_tree.set_enable_search(True)
|
||||
self.table_tree.set_rules_hint(True)
|
||||
self.table_tree.set_enable_tree_lines(True)
|
||||
self.table_tree.get_selection().set_mode(gtk.SELECTION_SINGLE)
|
||||
self.toggle_columns = []
|
||||
self.table_tree.connect("row-activated", self.row_activated_cb)
|
||||
self.top_bar = None
|
||||
self.tab_name = name
|
||||
|
||||
for i, column in enumerate(columns):
|
||||
col_name = column['col_name']
|
||||
col = gtk.TreeViewColumn(col_name)
|
||||
col = gtk.TreeViewColumn(column['col_name'])
|
||||
col.set_clickable(True)
|
||||
col.set_resizable(True)
|
||||
if self.tab_name.startswith('Included'):
|
||||
if col_name!='Included':
|
||||
col.set_sort_column_id(column['col_id'])
|
||||
else:
|
||||
col.set_sort_column_id(column['col_id'])
|
||||
col.set_sort_column_id(column['col_id'])
|
||||
if 'col_min' in column.keys():
|
||||
col.set_min_width(column['col_min'])
|
||||
if 'col_max' in column.keys():
|
||||
@@ -128,7 +124,7 @@ class HobViewTable (gtk.VBox):
|
||||
self.toggle_id = i
|
||||
col.pack_end(cell, True)
|
||||
col.set_attributes(cell, active=column['col_id'])
|
||||
self.toggle_columns.append(col_name)
|
||||
self.toggle_columns.append(column['col_name'])
|
||||
if 'col_group' in column.keys():
|
||||
col.set_cell_data_func(cell, self.set_group_number_cb)
|
||||
elif column['col_style'] == 'radio toggle':
|
||||
@@ -139,7 +135,7 @@ class HobViewTable (gtk.VBox):
|
||||
self.toggle_id = i
|
||||
col.pack_end(cell, True)
|
||||
col.set_attributes(cell, active=column['col_id'])
|
||||
self.toggle_columns.append(col_name)
|
||||
self.toggle_columns.append(column['col_name'])
|
||||
elif column['col_style'] == 'binb':
|
||||
cell = gtk.CellRendererText()
|
||||
col.pack_start(cell, True)
|
||||
@@ -147,42 +143,10 @@ class HobViewTable (gtk.VBox):
|
||||
if 'col_t_id' in column.keys():
|
||||
col.add_attribute(cell, 'font', column['col_t_id'])
|
||||
|
||||
self.scroll = gtk.ScrolledWindow()
|
||||
self.scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
|
||||
self.scroll.add(self.table_tree)
|
||||
|
||||
self.pack_end(self.scroll, True, True, 0)
|
||||
|
||||
def add_no_result_bar(self, entry):
|
||||
color = HobColors.KHAKI
|
||||
self.top_bar = gtk.EventBox()
|
||||
self.top_bar.set_size_request(-1, 70)
|
||||
self.top_bar.modify_bg(gtk.STATE_NORMAL, gtk.gdk.color_parse(color))
|
||||
self.top_bar.set_flags(gtk.CAN_DEFAULT)
|
||||
self.top_bar.grab_default()
|
||||
|
||||
no_result_tab = gtk.Table(5, 20, True)
|
||||
self.top_bar.add(no_result_tab)
|
||||
|
||||
label = gtk.Label()
|
||||
label.set_alignment(0.0, 0.5)
|
||||
title = "No results matching your search"
|
||||
label.set_markup("<span size='x-large'><b>%s</b></span>" % title)
|
||||
no_result_tab.attach(label, 1, 14, 1, 4)
|
||||
|
||||
clear_button = HobButton("Clear search")
|
||||
clear_button.set_tooltip_text("Clear search query")
|
||||
clear_button.connect('clicked', self.set_search_entry_clear_cb, entry)
|
||||
no_result_tab.attach(clear_button, 16, 19, 1, 4)
|
||||
|
||||
self.pack_start(self.top_bar, False, True, 12)
|
||||
self.top_bar.show_all()
|
||||
|
||||
def set_search_entry_clear_cb(self, button, search):
|
||||
if search.get_editable() == True:
|
||||
search.set_text("")
|
||||
search.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
|
||||
search.grab_focus()
|
||||
scroll = gtk.ScrolledWindow()
|
||||
scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_ALWAYS)
|
||||
scroll.add(self.table_tree)
|
||||
self.pack_start(scroll, True, True, 0)
|
||||
|
||||
def display_binb_cb(self, col, cell, model, it, col_id):
|
||||
binb = model.get_value(it, col_id)
|
||||
@@ -191,15 +155,9 @@ class HobViewTable (gtk.VBox):
|
||||
bin = binb.split(', ')
|
||||
total_no = len(bin)
|
||||
if total_no > 1 and bin[0] == "User Selected":
|
||||
if total_no > 2:
|
||||
present_binb = bin[1] + ' (+' + str(total_no - 1) + ')'
|
||||
else:
|
||||
present_binb = bin[1]
|
||||
present_binb = bin[1] + ' (+' + str(total_no) + ')'
|
||||
else:
|
||||
if total_no > 1:
|
||||
present_binb = bin[0] + ' (+' + str(total_no - 1) + ')'
|
||||
else:
|
||||
present_binb = bin[0]
|
||||
present_binb = bin[0] + ' (+' + str(total_no) + ')'
|
||||
cell.set_property('text', present_binb)
|
||||
else:
|
||||
cell.set_property('text', "")
|
||||
@@ -208,6 +166,10 @@ class HobViewTable (gtk.VBox):
|
||||
def set_model(self, tree_model):
|
||||
self.table_tree.set_model(tree_model)
|
||||
|
||||
def set_search_entry(self, search_column_id, entry):
|
||||
self.table_tree.set_search_column(search_column_id)
|
||||
self.table_tree.set_search_entry(entry)
|
||||
|
||||
def toggle_default(self):
|
||||
model = self.table_tree.get_model()
|
||||
if not model:
|
||||
@@ -379,13 +341,18 @@ class HobInfoButton(gtk.EventBox):
|
||||
hic.ICON_INFO_DISPLAY_FILE)
|
||||
self.image.show()
|
||||
self.add(self.image)
|
||||
self.tip_markup = tip_markup
|
||||
self.my_parent = parent
|
||||
|
||||
self.set_events(gtk.gdk.BUTTON_RELEASE |
|
||||
gtk.gdk.ENTER_NOTIFY_MASK |
|
||||
gtk.gdk.LEAVE_NOTIFY_MASK)
|
||||
|
||||
self.ptip = PersistentTooltip(tip_markup)
|
||||
|
||||
if parent:
|
||||
self.ptip.set_parent(parent)
|
||||
self.ptip.set_transient_for(parent)
|
||||
self.ptip.set_destroy_with_parent(True)
|
||||
|
||||
self.connect("button-release-event", self.button_release_cb)
|
||||
self.connect("enter-notify-event", self.mouse_in_cb)
|
||||
self.connect("leave-notify-event", self.mouse_out_cb)
|
||||
@@ -395,18 +362,7 @@ class HobInfoButton(gtk.EventBox):
|
||||
PersistentTooltip
|
||||
"""
|
||||
def button_release_cb(self, widget, event):
|
||||
from bb.ui.crumbs.hig.propertydialog import PropertyDialog
|
||||
self.dialog = PropertyDialog(title = '',
|
||||
parent = self.my_parent,
|
||||
information = self.tip_markup,
|
||||
flags = gtk.DIALOG_DESTROY_WITH_PARENT
|
||||
| gtk.DIALOG_NO_SEPARATOR)
|
||||
|
||||
button = self.dialog.add_button("Close", gtk.RESPONSE_CANCEL)
|
||||
HobAltButton.style_button(button)
|
||||
button.connect("clicked", lambda w: self.dialog.destroy())
|
||||
self.dialog.show_all()
|
||||
self.dialog.run()
|
||||
self.ptip.show()
|
||||
|
||||
"""
|
||||
Change to the prelight image when the mouse enters the widget
|
||||
@@ -487,8 +443,7 @@ class HobNotebook(gtk.Notebook):
|
||||
self.pages = []
|
||||
|
||||
self.search = None
|
||||
self.search_focus = False
|
||||
self.page_changed = False
|
||||
self.search_name = ""
|
||||
|
||||
self.connect("switch-page", self.page_changed_cb)
|
||||
|
||||
@@ -501,10 +456,6 @@ class HobNotebook(gtk.Notebook):
|
||||
else:
|
||||
lbl.set_active(False)
|
||||
|
||||
if self.search:
|
||||
self.page_changed = True
|
||||
self.reset_entry(self.search, page_num)
|
||||
|
||||
def append_page(self, child, tab_label, tab_tooltip=None):
|
||||
label = HobTabLabel(tab_label)
|
||||
if tab_tooltip:
|
||||
@@ -513,22 +464,16 @@ class HobNotebook(gtk.Notebook):
|
||||
self.pages.append(label)
|
||||
gtk.Notebook.append_page(self, child, label)
|
||||
|
||||
def set_entry(self, names, tips):
|
||||
def set_entry(self, name="Search:"):
|
||||
self.search = gtk.Entry()
|
||||
self.search_names = names
|
||||
self.search_tips = tips
|
||||
self.search_name = name
|
||||
style = self.search.get_style()
|
||||
style.text[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(HobColors.GRAY, False, False)
|
||||
self.search.set_style(style)
|
||||
self.search.set_text(names[0])
|
||||
self.search.set_tooltip_text(self.search_tips[0])
|
||||
self.search.props.has_tooltip = True
|
||||
|
||||
self.search.set_text(name)
|
||||
self.search.set_editable(False)
|
||||
self.search.set_icon_from_stock(gtk.ENTRY_ICON_SECONDARY, gtk.STOCK_CLEAR)
|
||||
self.search.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
|
||||
self.search.connect("icon-release", self.set_search_entry_clear_cb)
|
||||
self.search.set_width_chars(30)
|
||||
self.search.show()
|
||||
|
||||
self.search.connect("focus-in-event", self.set_search_entry_editable_cb)
|
||||
@@ -546,42 +491,31 @@ class HobNotebook(gtk.Notebook):
|
||||
child.set_count(0)
|
||||
|
||||
def set_search_entry_editable_cb(self, search, event):
|
||||
self.search_focus = True
|
||||
search.set_editable(True)
|
||||
text = search.get_text()
|
||||
if text in self.search_names:
|
||||
search.set_text("")
|
||||
search.set_text("")
|
||||
style = self.search.get_style()
|
||||
style.text[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(HobColors.BLACK, False, False)
|
||||
search.set_style(style)
|
||||
|
||||
def set_search_entry_reset_cb(self, search, event):
|
||||
page_num = self.get_current_page()
|
||||
text = search.get_text()
|
||||
if not text:
|
||||
self.reset_entry(search, page_num)
|
||||
|
||||
def reset_entry(self, entry, page_num):
|
||||
def reset_entry(self, entry):
|
||||
style = entry.get_style()
|
||||
style.text[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(HobColors.GRAY, False, False)
|
||||
entry.set_style(style)
|
||||
entry.set_text(self.search_names[page_num])
|
||||
entry.set_tooltip_text(self.search_tips[page_num])
|
||||
entry.set_text(self.search_name)
|
||||
entry.set_editable(False)
|
||||
entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
|
||||
|
||||
def set_search_entry_reset_cb(self, search, event):
|
||||
self.reset_entry(search)
|
||||
|
||||
def set_search_entry_clear_cb(self, search, icon_pos, event):
|
||||
if search.get_editable() == True:
|
||||
search.set_text("")
|
||||
search.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
|
||||
search.grab_focus()
|
||||
|
||||
def set_page(self, title):
|
||||
for child in self.pages:
|
||||
if child.lbl.get_label() == title:
|
||||
child.grab_focus()
|
||||
self.set_current_page(self.pages.index(child))
|
||||
return
|
||||
self.set_current_page(self.page_num(child))
|
||||
|
||||
class HobWarpCellRendererText(gtk.CellRendererText):
|
||||
def __init__(self, col_number):
|
||||
|
||||
@@ -45,8 +45,6 @@ class ImageConfigurationPage (HobPage):
|
||||
# or by manual. If by manual, all user's recipe selection and package selection are
|
||||
# cleared.
|
||||
self.machine_combo_changed_by_manual = True
|
||||
self.stopping = False
|
||||
self.warning_shift = 0
|
||||
self.create_visual_elements()
|
||||
|
||||
def create_visual_elements(self):
|
||||
@@ -55,6 +53,12 @@ class ImageConfigurationPage (HobPage):
|
||||
self.toolbar.set_orientation(gtk.ORIENTATION_HORIZONTAL)
|
||||
self.toolbar.set_style(gtk.TOOLBAR_BOTH)
|
||||
|
||||
template_button = self.append_toolbar_button(self.toolbar,
|
||||
"Templates",
|
||||
hic.ICON_TEMPLATES_DISPLAY_FILE,
|
||||
hic.ICON_TEMPLATES_HOVER_FILE,
|
||||
"Load a previously saved template",
|
||||
self.template_button_clicked_cb)
|
||||
my_images_button = self.append_toolbar_button(self.toolbar,
|
||||
"Images",
|
||||
hic.ICON_IMAGES_DISPLAY_FILE,
|
||||
@@ -110,10 +114,9 @@ class ImageConfigurationPage (HobPage):
|
||||
self.show_all()
|
||||
|
||||
def update_progress_bar(self, title, fraction, status=None):
|
||||
if self.stopping == False:
|
||||
self.progress_bar.update(fraction)
|
||||
self.progress_bar.set_text(title)
|
||||
self.progress_bar.set_rcstyle(status)
|
||||
self.progress_bar.update(fraction)
|
||||
self.progress_bar.set_title(title)
|
||||
self.progress_bar.set_rcstyle(status)
|
||||
|
||||
def show_info_populating(self):
|
||||
self._pack_components(pack_config_build_button = False)
|
||||
@@ -136,42 +139,6 @@ class ImageConfigurationPage (HobPage):
|
||||
if self.builder.recipe_model.get_selected_image() == self.builder.recipe_model.__custom_image__:
|
||||
self.just_bake_button.hide()
|
||||
|
||||
def add_warnings_bar(self):
|
||||
#create the warnings bar shown when recipes parsing generates warnings
|
||||
color = HobColors.KHAKI
|
||||
warnings_bar = gtk.EventBox()
|
||||
warnings_bar.modify_bg(gtk.STATE_NORMAL, gtk.gdk.color_parse(color))
|
||||
warnings_bar.set_flags(gtk.CAN_DEFAULT)
|
||||
warnings_bar.grab_default()
|
||||
|
||||
build_stop_tab = gtk.Table(10, 20, True)
|
||||
warnings_bar.add(build_stop_tab)
|
||||
|
||||
icon = gtk.Image()
|
||||
icon_pix_buffer = gtk.gdk.pixbuf_new_from_file(hic.ICON_INDI_ALERT_FILE)
|
||||
icon.set_from_pixbuf(icon_pix_buffer)
|
||||
build_stop_tab.attach(icon, 0, 2, 0, 10)
|
||||
|
||||
label = gtk.Label()
|
||||
label.set_alignment(0.0, 0.5)
|
||||
warnings_nb = len(self.builder.parsing_warnings)
|
||||
if warnings_nb == 1:
|
||||
label.set_markup("<span size='x-large'><b>1 recipe parsing warning</b></span>")
|
||||
else:
|
||||
label.set_markup("<span size='x-large'><b>%s recipe parsing warnings</b></span>" % warnings_nb)
|
||||
build_stop_tab.attach(label, 2, 12, 0, 10)
|
||||
|
||||
view_warnings_button = HobButton("View warnings")
|
||||
view_warnings_button.connect('clicked', self.view_warnings_button_clicked_cb)
|
||||
build_stop_tab.attach(view_warnings_button, 15, 19, 1, 9)
|
||||
|
||||
return warnings_bar
|
||||
|
||||
def disable_warnings_bar(self):
|
||||
if self.builder.parsing_warnings:
|
||||
self.warnings_bar.hide_all()
|
||||
self.builder.parsing_warnings = []
|
||||
|
||||
def create_config_machine(self):
|
||||
self.machine_title = gtk.Label()
|
||||
self.machine_title.set_alignment(0.0, 0.5)
|
||||
@@ -198,7 +165,8 @@ class ImageConfigurationPage (HobPage):
|
||||
markup += "For more on layers, check the <a href=\""
|
||||
markup += "http://www.yoctoproject.org/docs/current/dev-manual/"
|
||||
markup += "dev-manual.html#understanding-and-using-layers\">reference manual</a>."
|
||||
self.layer_info_icon = HobInfoButton("<b>Layers</b>" + "*" + markup, self.get_parent())
|
||||
self.layer_info_icon = HobInfoButton(markup, self.get_parent())
|
||||
|
||||
# self.progress_box = gtk.HBox(False, 6)
|
||||
self.progress_bar = HobProgressBar()
|
||||
# self.progress_box.pack_start(self.progress_bar, expand=True, fill=True)
|
||||
@@ -217,12 +185,6 @@ class ImageConfigurationPage (HobPage):
|
||||
#self.gtable.attach(self.progress_box, 0, 40, 15, 18)
|
||||
self.gtable.attach(self.progress_bar, 0, 37, 15, 18)
|
||||
self.gtable.attach(self.stop_button, 37, 40, 15, 18, 0, 0)
|
||||
if self.builder.parsing_warnings:
|
||||
self.warnings_bar = self.add_warnings_bar()
|
||||
self.gtable.attach(self.warnings_bar, 0, 40, 14, 18)
|
||||
self.warning_shift = 4
|
||||
else:
|
||||
self.warning_shift = 0
|
||||
self.gtable.attach(self.machine_separator, 0, 40, 13, 14)
|
||||
|
||||
def create_config_baseimg(self):
|
||||
@@ -258,12 +220,12 @@ class ImageConfigurationPage (HobPage):
|
||||
self.image_separator = gtk.HSeparator()
|
||||
|
||||
def set_config_baseimg_layout(self):
|
||||
self.gtable.attach(self.image_title, 0, 40, 15+self.warning_shift, 17+self.warning_shift)
|
||||
self.gtable.attach(self.image_title_desc, 0, 40, 18+self.warning_shift, 22+self.warning_shift)
|
||||
self.gtable.attach(self.image_combo, 0, 12, 23+self.warning_shift, 26+self.warning_shift)
|
||||
self.gtable.attach(self.image_desc, 0, 12, 27+self.warning_shift, 33+self.warning_shift)
|
||||
self.gtable.attach(self.view_adv_configuration_button, 14, 36, 23+self.warning_shift, 28+self.warning_shift)
|
||||
self.gtable.attach(self.image_separator, 0, 40, 35+self.warning_shift, 36+self.warning_shift)
|
||||
self.gtable.attach(self.image_title, 0, 40, 15, 17)
|
||||
self.gtable.attach(self.image_title_desc, 0, 40, 18, 22)
|
||||
self.gtable.attach(self.image_combo, 0, 12, 23, 26)
|
||||
self.gtable.attach(self.image_desc, 0, 12, 27, 33)
|
||||
self.gtable.attach(self.view_adv_configuration_button, 14, 36, 23, 28)
|
||||
self.gtable.attach(self.image_separator, 0, 40, 35, 36)
|
||||
|
||||
def create_config_build_button(self):
|
||||
# Create the "Build packages" and "Build image" buttons at the bottom
|
||||
@@ -286,17 +248,9 @@ class ImageConfigurationPage (HobPage):
|
||||
return button_box
|
||||
|
||||
def stop_button_clicked_cb(self, button):
|
||||
self.stopping = True
|
||||
self.progress_bar.set_text("Stopping recipe parsing")
|
||||
self.progress_bar.set_rcstyle("stop")
|
||||
self.builder.cancel_parse_sync()
|
||||
|
||||
def view_warnings_button_clicked_cb(self, button):
|
||||
self.builder.show_warning_dialog()
|
||||
|
||||
def machine_combo_changed_cb(self, machine_combo):
|
||||
self.stopping = False
|
||||
self.builder.parsing_warnings = []
|
||||
combo_item = machine_combo.get_active_text()
|
||||
if not combo_item or combo_item == self.__dummy_machine__:
|
||||
return
|
||||
@@ -317,7 +271,6 @@ class ImageConfigurationPage (HobPage):
|
||||
self.builder.populate_recipe_package_info_async()
|
||||
|
||||
def update_machine_combo(self):
|
||||
self.disable_warnings_bar()
|
||||
all_machines = [self.__dummy_machine__] + self.builder.parameters.all_machines
|
||||
|
||||
model = self.machine_combo.get_model()
|
||||
@@ -327,7 +280,6 @@ class ImageConfigurationPage (HobPage):
|
||||
self.machine_combo.set_active(0)
|
||||
|
||||
def switch_machine_combo(self):
|
||||
self.disable_warnings_bar()
|
||||
self.machine_combo_changed_by_manual = False
|
||||
model = self.machine_combo.get_model()
|
||||
active = 0
|
||||
@@ -405,7 +357,7 @@ class ImageConfigurationPage (HobPage):
|
||||
image_model = recipe_model.tree_model(filter)
|
||||
image_model.set_sort_column_id(recipe_model.COL_NAME, gtk.SORT_ASCENDING)
|
||||
active = 0
|
||||
cnt = 0
|
||||
cnt = 1
|
||||
|
||||
white_pattern = []
|
||||
if self.builder.parameters.image_white_pattern:
|
||||
@@ -423,10 +375,7 @@ class ImageConfigurationPage (HobPage):
|
||||
model = self.image_combo.get_model()
|
||||
model.clear()
|
||||
# Set a indicator text to combo store when first open
|
||||
if not selected_image:
|
||||
self.image_combo.append_text(self.__dummy_image__)
|
||||
cnt = cnt + 1
|
||||
|
||||
self.image_combo.append_text(self.__dummy_image__)
|
||||
# append and set active
|
||||
while it:
|
||||
path = image_model.get_path(it)
|
||||
@@ -480,13 +429,18 @@ class ImageConfigurationPage (HobPage):
|
||||
self.builder.reparse_post_adv_settings()
|
||||
|
||||
def just_bake_button_clicked_cb(self, button):
|
||||
self.builder.parsing_warnings = []
|
||||
self.builder.just_bake()
|
||||
|
||||
def edit_image_button_clicked_cb(self, button):
|
||||
self.builder.configuration.initial_selected_image = self.builder.configuration.selected_image
|
||||
self.builder.show_recipes()
|
||||
|
||||
def template_button_clicked_cb(self, button):
|
||||
response, path = self.builder.show_load_template_dialog()
|
||||
if not response:
|
||||
return
|
||||
if path:
|
||||
self.builder.load_template(path)
|
||||
|
||||
def my_images_button_clicked_cb(self, button):
|
||||
self.builder.show_load_my_images_dialog()
|
||||
|
||||
|
||||
@@ -26,8 +26,7 @@ from bb.ui.crumbs.hobcolor import HobColors
|
||||
from bb.ui.crumbs.hobwidget import hic, HobViewTable, HobAltButton, HobButton
|
||||
from bb.ui.crumbs.hobpages import HobPage
|
||||
import subprocess
|
||||
from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
|
||||
|
||||
from bb.ui.crumbs.hig import CrumbsDialog
|
||||
#
|
||||
# ImageDetailsPage
|
||||
#
|
||||
@@ -197,6 +196,12 @@ class ImageDetailsPage (HobPage):
|
||||
self.toolbar.set_orientation(gtk.ORIENTATION_HORIZONTAL)
|
||||
self.toolbar.set_style(gtk.TOOLBAR_BOTH)
|
||||
|
||||
template_button = self.append_toolbar_button(self.toolbar,
|
||||
"Templates",
|
||||
hic.ICON_TEMPLATES_DISPLAY_FILE,
|
||||
hic.ICON_TEMPLATES_HOVER_FILE,
|
||||
"Load a previously saved template",
|
||||
self.template_button_clicked_cb)
|
||||
my_images_button = self.append_toolbar_button(self.toolbar,
|
||||
"Images",
|
||||
hic.ICON_IMAGES_DISPLAY_FILE,
|
||||
@@ -259,7 +264,11 @@ class ImageDetailsPage (HobPage):
|
||||
self.build_result = self.BuildDetailBox(varlist=varlist, vallist=vallist, icon=icon, color=color)
|
||||
self.box_group_area.pack_start(self.build_result, expand=False, fill=False)
|
||||
|
||||
self.buttonlist = ["Build new image", "Run image", "Deploy image"]
|
||||
# create the buttons at the bottom first because the buttons are used in apply_button_per_image()
|
||||
if self.build_succeeded:
|
||||
self.buttonlist = ["Build new image", "Save as template", "Run image", "Deploy image"]
|
||||
else: # get to this page from "My images"
|
||||
self.buttonlist = ["Build new image", "Run image", "Deploy image"]
|
||||
|
||||
# Name
|
||||
self.image_store = []
|
||||
@@ -395,8 +404,7 @@ class ImageDetailsPage (HobPage):
|
||||
|
||||
def open_log_clicked_cb(self, button, log_file):
|
||||
if log_file:
|
||||
log_file = "file:///" + log_file
|
||||
gtk.show_uri(screen=button.get_screen(), uri=log_file, timestamp=0)
|
||||
os.system("xdg-open /%s" % log_file)
|
||||
|
||||
def refresh_package_detail_box(self, image_size):
|
||||
self.package_detail.update_line_widgets("Total image size: ", image_size)
|
||||
@@ -572,6 +580,26 @@ class ImageDetailsPage (HobPage):
|
||||
created = True
|
||||
is_runnable = True
|
||||
|
||||
name = "Save as template"
|
||||
if name in buttonlist:
|
||||
if created == True:
|
||||
# separator
|
||||
#label = gtk.Label(" or ")
|
||||
#self.details_bottom_buttons.pack_end(label, expand=False, fill=False)
|
||||
|
||||
# create button "Save as template"
|
||||
save_button = HobAltButton("Save as template")
|
||||
else:
|
||||
save_button = HobButton("Save as template")
|
||||
#save_button.set_size_request(205, 49)
|
||||
save_button.set_flags(gtk.CAN_DEFAULT)
|
||||
packed = True
|
||||
save_button.set_tooltip_text("Save the image configuration for reuse")
|
||||
button_id = save_button.connect("clicked", self.save_button_clicked_cb)
|
||||
self.button_ids[button_id] = save_button
|
||||
self.details_bottom_buttons.pack_end(save_button, expand=False, fill=False)
|
||||
create = True
|
||||
|
||||
name = "Build new image"
|
||||
if name in buttonlist:
|
||||
# create button "Build new image"
|
||||
@@ -588,6 +616,9 @@ class ImageDetailsPage (HobPage):
|
||||
|
||||
return is_runnable
|
||||
|
||||
def save_button_clicked_cb(self, button):
|
||||
self.builder.show_save_template_dialog()
|
||||
|
||||
def deploy_button_clicked_cb(self, button):
|
||||
if self.toggled_image:
|
||||
if self.num_toggled > 1:
|
||||
@@ -615,6 +646,13 @@ class ImageDetailsPage (HobPage):
|
||||
def edit_packages_button_clicked_cb(self, button):
|
||||
self.builder.show_packages(ask=False)
|
||||
|
||||
def template_button_clicked_cb(self, button):
|
||||
response, path = self.builder.show_load_template_dialog()
|
||||
if not response:
|
||||
return
|
||||
if path:
|
||||
self.builder.load_template(path)
|
||||
|
||||
def my_images_button_clicked_cb(self, button):
|
||||
self.builder.show_load_my_images_dialog()
|
||||
|
||||
|
||||
@@ -34,12 +34,10 @@ class PackageSelectionPage (HobPage):
|
||||
|
||||
pages = [
|
||||
{
|
||||
'name' : 'Included packages',
|
||||
'tooltip' : 'The packages currently included for your image',
|
||||
'filter' : { PackageListModel.COL_INC : [True] },
|
||||
'search' : 'Search packages by name',
|
||||
'searchtip' : 'Enter a package name to find it',
|
||||
'columns' : [{
|
||||
'name' : 'Included packages',
|
||||
'tooltip' : 'The packages currently included for your image',
|
||||
'filter' : { PackageListModel.COL_INC : [True] },
|
||||
'columns' : [{
|
||||
'col_name' : 'Package name',
|
||||
'col_id' : PackageListModel.COL_NAME,
|
||||
'col_style': 'text',
|
||||
@@ -54,14 +52,7 @@ class PackageSelectionPage (HobPage):
|
||||
'col_max' : 300,
|
||||
'expand' : 'True'
|
||||
}, {
|
||||
'col_name' : 'Recipe',
|
||||
'col_id' : PackageListModel.COL_RCP,
|
||||
'col_style': 'text',
|
||||
'col_min' : 100,
|
||||
'col_max' : 250,
|
||||
'expand' : 'True'
|
||||
}, {
|
||||
'col_name' : 'Brought in by (+others)',
|
||||
'col_name' : 'Brought in by',
|
||||
'col_id' : PackageListModel.COL_BINB,
|
||||
'col_style': 'binb',
|
||||
'col_min' : 100,
|
||||
@@ -75,12 +66,10 @@ class PackageSelectionPage (HobPage):
|
||||
'col_max' : 100
|
||||
}]
|
||||
}, {
|
||||
'name' : 'All packages',
|
||||
'tooltip' : 'All packages that have been built',
|
||||
'filter' : {},
|
||||
'search' : 'Search packages by name',
|
||||
'searchtip' : 'Enter a package name to find it',
|
||||
'columns' : [{
|
||||
'name' : 'All packages',
|
||||
'tooltip' : 'All packages that have been built',
|
||||
'filter' : {},
|
||||
'columns' : [{
|
||||
'col_name' : 'Package name',
|
||||
'col_id' : PackageListModel.COL_NAME,
|
||||
'col_style': 'text',
|
||||
@@ -94,13 +83,6 @@ class PackageSelectionPage (HobPage):
|
||||
'col_min' : 100,
|
||||
'col_max' : 500,
|
||||
'expand' : 'True'
|
||||
}, {
|
||||
'col_name' : 'Recipe',
|
||||
'col_id' : PackageListModel.COL_RCP,
|
||||
'col_style': 'text',
|
||||
'col_min' : 100,
|
||||
'col_max' : 250,
|
||||
'expand' : 'True'
|
||||
}, {
|
||||
'col_name' : 'Included',
|
||||
'col_id' : PackageListModel.COL_INC,
|
||||
@@ -117,7 +99,7 @@ class PackageSelectionPage (HobPage):
|
||||
def __init__(self, builder):
|
||||
super(PackageSelectionPage, self).__init__(builder, "Edit packages")
|
||||
|
||||
# set invisible members
|
||||
# set invisiable members
|
||||
self.recipe_model = self.builder.recipe_model
|
||||
self.package_model = self.builder.package_model
|
||||
|
||||
@@ -136,31 +118,26 @@ class PackageSelectionPage (HobPage):
|
||||
# set visible members
|
||||
self.ins = HobNotebook()
|
||||
self.tables = [] # we need to modify table when the dialog is shown
|
||||
|
||||
search_names = []
|
||||
search_tips = []
|
||||
# append the tab
|
||||
for page in self.pages:
|
||||
columns = page['columns']
|
||||
name = page['name']
|
||||
tab = HobViewTable(columns, name)
|
||||
search_names.append(page['search'])
|
||||
search_tips.append(page['searchtip'])
|
||||
tab = HobViewTable(columns)
|
||||
filter = page['filter']
|
||||
sort_model = self.package_model.tree_model(filter, initial=True)
|
||||
tab.set_model(sort_model)
|
||||
tab.connect("toggled", self.table_toggled_cb, name)
|
||||
if name == "Included packages":
|
||||
tab.connect("button-release-event", self.button_click_cb)
|
||||
tab.connect("cell-fadeinout-stopped", self.after_fadeout_checkin_include)
|
||||
if name == "All packages":
|
||||
tab.set_model(self.package_model.tree_model(filter))
|
||||
tab.connect("toggled", self.table_toggled_cb, page['name'])
|
||||
if page['name'] == "Included packages":
|
||||
tab.connect("button-release-event", self.button_click_cb)
|
||||
tab.connect("cell-fadeinout-stopped", self.after_fadeout_checkin_include)
|
||||
self.ins.append_page(tab, page['name'], page['tooltip'])
|
||||
self.tables.append(tab)
|
||||
|
||||
self.ins.set_entry(search_names, search_tips)
|
||||
self.ins.search.connect("changed", self.search_entry_changed)
|
||||
self.ins.set_entry("Search packages:")
|
||||
# set the search entry for each table
|
||||
for tab in self.tables:
|
||||
search_tip = "Enter a package name to find it"
|
||||
self.ins.search.set_tooltip_text(search_tip)
|
||||
self.ins.search.props.has_tooltip = True
|
||||
tab.set_search_entry(0, self.ins.search)
|
||||
|
||||
# add all into the dialog
|
||||
self.box_group_area.pack_start(self.ins, expand=True, fill=True)
|
||||
@@ -180,53 +157,17 @@ class PackageSelectionPage (HobPage):
|
||||
self.back_button.connect("clicked", self.back_button_clicked_cb)
|
||||
self.button_box.pack_end(self.back_button, expand=False, fill=False)
|
||||
|
||||
def search_entry_changed(self, entry):
|
||||
text = entry.get_text()
|
||||
if self.ins.search_focus:
|
||||
self.ins.search_focus = False
|
||||
elif self.ins.page_changed:
|
||||
self.ins.page_change = False
|
||||
self.filter_search(entry)
|
||||
elif text not in self.ins.search_names:
|
||||
self.filter_search(entry)
|
||||
|
||||
def filter_search(self, entry):
|
||||
text = entry.get_text()
|
||||
current_tab = self.ins.get_current_page()
|
||||
filter = self.pages[current_tab]['filter']
|
||||
filter[PackageListModel.COL_NAME] = text
|
||||
self.tables[current_tab].set_model(self.package_model.tree_model(filter, search_data=text))
|
||||
if self.package_model.filtered_nb == 0:
|
||||
if not self.ins.get_nth_page(current_tab).top_bar:
|
||||
self.ins.get_nth_page(current_tab).add_no_result_bar(entry)
|
||||
self.ins.get_nth_page(current_tab).top_bar.show()
|
||||
self.ins.get_nth_page(current_tab).scroll.hide()
|
||||
else:
|
||||
if self.ins.get_nth_page(current_tab).top_bar:
|
||||
self.ins.get_nth_page(current_tab).top_bar.hide()
|
||||
self.ins.get_nth_page(current_tab).scroll.show()
|
||||
if entry.get_text() == '':
|
||||
entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
|
||||
else:
|
||||
entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, True)
|
||||
|
||||
def button_click_cb(self, widget, event):
|
||||
path, col = widget.table_tree.get_cursor()
|
||||
tree_model = widget.table_tree.get_model()
|
||||
if path and col.get_title() != 'Included': # else activation is likely a removal
|
||||
properties = {'binb': '' , 'name': '', 'size':'', 'recipe':'', 'files_list':''}
|
||||
properties['binb'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_BINB)
|
||||
properties['name'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_NAME)
|
||||
properties['size'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_SIZE)
|
||||
properties['recipe'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_RCP)
|
||||
properties['files_list'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_FLIST)
|
||||
|
||||
self.builder.show_recipe_property_dialog(properties)
|
||||
if path: # else activation is likely a removal
|
||||
binb = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_BINB)
|
||||
if binb:
|
||||
self.builder.show_binb_dialog(binb)
|
||||
|
||||
def open_log_clicked_cb(self, button, log_file):
|
||||
if log_file:
|
||||
log_file = "file:///" + log_file
|
||||
gtk.show_uri(screen=button.get_screen(), uri=log_file, timestamp=0)
|
||||
os.system("xdg-open /%s" % log_file)
|
||||
|
||||
def show_page(self, log_file):
|
||||
children = self.button_box.get_children() or []
|
||||
@@ -243,7 +184,6 @@ class PackageSelectionPage (HobPage):
|
||||
self.show_all()
|
||||
|
||||
def build_image_clicked_cb(self, button):
|
||||
self.builder.parsing_warnings = []
|
||||
self.builder.build_image()
|
||||
|
||||
def back_button_clicked_cb(self, button):
|
||||
@@ -254,7 +194,13 @@ class PackageSelectionPage (HobPage):
|
||||
else:
|
||||
self.builder.show_configuration()
|
||||
|
||||
def _expand_all(self):
|
||||
for tab in self.tables:
|
||||
tab.table_tree.expand_all()
|
||||
|
||||
def refresh_selection(self):
|
||||
self._expand_all()
|
||||
|
||||
self.builder.configuration.selected_packages = self.package_model.get_selected_packages()
|
||||
self.builder.configuration.user_selected_packages = self.package_model.get_user_selected_packages()
|
||||
selected_packages_num = len(self.builder.configuration.selected_packages)
|
||||
@@ -262,8 +208,8 @@ class PackageSelectionPage (HobPage):
selected_packages_size_str = HobPage._size_to_string(selected_packages_size)

image_overhead_factor = self.builder.configuration.image_overhead_factor
image_rootfs_size = self.builder.configuration.image_rootfs_size / 1024 # image_rootfs_size is KB
image_extra_size = self.builder.configuration.image_extra_size / 1024 # image_extra_size is KB
image_rootfs_size = self.builder.configuration.image_rootfs_size * 1024 # image_rootfs_size is KB
image_extra_size = self.builder.configuration.image_extra_size * 1024 # image_extra_size is KB
base_size = image_overhead_factor * selected_packages_size
image_total_size = max(base_size, image_rootfs_size) + image_extra_size
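The lines above estimate the total image size from the selected packages, an overhead factor, a configured rootfs floor and extra space; the two variants differ only in whether the KB-denominated values are divided or multiplied by 1024. A hedged sketch of the calculation (hypothetical function and argument names, result in bytes):

```python
def estimate_image_size(selected_packages_size, overhead_factor,
                        rootfs_size_kb, extra_size_kb):
    """Rough total image size in bytes (sketch only).

    selected_packages_size: sum of the selected packages' sizes, in bytes.
    overhead_factor: filesystem overhead multiplier, e.g. 1.3.
    rootfs_size_kb / extra_size_kb: configured minimum rootfs size and
    extra free space, both given in kilobytes (hence the * 1024).
    """
    base_size = overhead_factor * selected_packages_size
    rootfs_floor = rootfs_size_kb * 1024
    extra_space = extra_size_kb * 1024
    return max(base_size, rootfs_floor) + extra_space

# Example: 200 MB of packages, 30% overhead, 100 MB floor, 10 MB extra.
print(estimate_image_size(200 * 1024 ** 2, 1.3, 100 * 1024, 10 * 1024))
```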
|
||||
if "zypper" in self.builder.configuration.selected_packages:
|
||||
@@ -288,7 +234,6 @@ class PackageSelectionPage (HobPage):
|
||||
self.refresh_selection()
|
||||
if not self.builder.customized:
|
||||
self.builder.customized = True
|
||||
self.builder.configuration.initial_selected_image = self.builder.configuration.selected_image
|
||||
self.builder.configuration.selected_image = self.recipe_model.__custom_image__
|
||||
self.builder.rcppkglist_populated()
|
||||
|
||||
|
||||
@@ -43,11 +43,6 @@ class HobProgressBar (gtk.ProgressBar):
|
||||
text += " %.0f%%" % self.percentage
|
||||
self.set_text(text)
|
||||
|
||||
def set_stop_title(self, text=None):
|
||||
if not text:
|
||||
text = ""
|
||||
self.set_text(text)
|
||||
|
||||
def reset(self):
|
||||
self.set_fraction(0)
|
||||
self.set_text("")
|
||||
|
||||
@@ -33,13 +33,11 @@ from bb.ui.crumbs.hobpages import HobPage
|
||||
class RecipeSelectionPage (HobPage):
|
||||
pages = [
|
||||
{
|
||||
'name' : 'Included recipes',
|
||||
'tooltip' : 'The recipes currently included for your image',
|
||||
'filter' : { RecipeListModel.COL_INC : [True],
|
||||
'name' : 'Included recipes',
|
||||
'tooltip' : 'The recipes currently included for your image',
|
||||
'filter' : { RecipeListModel.COL_INC : [True],
|
||||
RecipeListModel.COL_TYPE : ['recipe', 'packagegroup'] },
|
||||
'search' : 'Search recipes by name',
|
||||
'searchtip' : 'Enter a recipe name to find it',
|
||||
'columns' : [{
|
||||
'columns' : [{
|
||||
'col_name' : 'Recipe name',
|
||||
'col_id' : RecipeListModel.COL_NAME,
|
||||
'col_style': 'text',
|
||||
@@ -54,7 +52,7 @@ class RecipeSelectionPage (HobPage):
|
||||
'col_max' : 300,
|
||||
'expand' : 'True'
|
||||
}, {
|
||||
'col_name' : 'Brought in by (+others)',
|
||||
'col_name' : 'Brought in by',
|
||||
'col_id' : RecipeListModel.COL_BINB,
|
||||
'col_style': 'binb',
|
||||
'col_min' : 100,
|
||||
@@ -68,12 +66,10 @@ class RecipeSelectionPage (HobPage):
|
||||
'col_max' : 100
|
||||
}]
|
||||
}, {
|
||||
'name' : 'All recipes',
|
||||
'tooltip' : 'All recipes in your configured layers',
|
||||
'filter' : { RecipeListModel.COL_TYPE : ['recipe'] },
|
||||
'search' : 'Search recipes by name',
|
||||
'searchtip' : 'Enter a recipe name to find it',
|
||||
'columns' : [{
|
||||
'name' : 'All recipes',
|
||||
'tooltip' : 'All recipes in your configured layers',
|
||||
'filter' : { RecipeListModel.COL_TYPE : ['recipe'] },
|
||||
'columns' : [{
|
||||
'col_name' : 'Recipe name',
|
||||
'col_id' : RecipeListModel.COL_NAME,
|
||||
'col_style': 'text',
|
||||
@@ -102,12 +98,10 @@ class RecipeSelectionPage (HobPage):
|
||||
'col_max' : 100
|
||||
}]
|
||||
}, {
|
||||
'name' : 'Package Groups',
|
||||
'tooltip' : 'All package groups in your configured layers',
|
||||
'filter' : { RecipeListModel.COL_TYPE : ['packagegroup'] },
|
||||
'search' : 'Search package groups by name',
|
||||
'searchtip' : 'Enter a package group name to find it',
|
||||
'columns' : [{
|
||||
'name' : 'Package Groups',
|
||||
'tooltip' : 'All package groups in your configured layers',
|
||||
'filter' : { RecipeListModel.COL_TYPE : ['packagegroup'] },
|
||||
'columns' : [{
|
||||
'col_name' : 'Package group name',
|
||||
'col_id' : RecipeListModel.COL_NAME,
|
||||
'col_style': 'text',
|
||||
@@ -148,34 +142,26 @@ class RecipeSelectionPage (HobPage):
|
||||
# set visible members
|
||||
self.ins = HobNotebook()
|
||||
self.tables = [] # we need modify table when the dialog is shown
|
||||
|
||||
search_names = []
|
||||
search_tips = []
|
||||
# append the tabs in order
|
||||
for page in self.pages:
|
||||
columns = page['columns']
|
||||
name = page['name']
|
||||
tab = HobViewTable(columns, name)
|
||||
search_names.append(page['search'])
|
||||
search_tips.append(page['searchtip'])
|
||||
tab = HobViewTable(columns)
|
||||
filter = page['filter']
|
||||
sort_model = self.recipe_model.tree_model(filter, initial=True)
|
||||
tab.set_model(sort_model)
|
||||
tab.connect("toggled", self.table_toggled_cb, name)
|
||||
if name == "Included recipes":
|
||||
tab.set_model(self.recipe_model.tree_model(filter))
|
||||
tab.connect("toggled", self.table_toggled_cb, page['name'])
|
||||
if page['name'] == "Included recipes":
|
||||
tab.connect("button-release-event", self.button_click_cb)
|
||||
tab.connect("cell-fadeinout-stopped", self.after_fadeout_checkin_include)
|
||||
if name == "Package Groups":
|
||||
tab.connect("button-release-event", self.button_click_cb)
|
||||
tab.connect("cell-fadeinout-stopped", self.after_fadeout_checkin_include)
|
||||
if name == "All recipes":
|
||||
tab.connect("button-release-event", self.button_click_cb)
|
||||
tab.connect("cell-fadeinout-stopped", self.button_click_cb)
|
||||
self.ins.append_page(tab, page['name'], page['tooltip'])
|
||||
self.tables.append(tab)
|
||||
|
||||
self.ins.set_entry(search_names, search_tips)
|
||||
self.ins.search.connect("changed", self.search_entry_changed)
|
||||
self.ins.set_entry("Search recipes:")
|
||||
# set the search entry for each table
|
||||
for tab in self.tables:
|
||||
search_tip = "Enter a recipe's or task's name to find it"
|
||||
self.ins.search.set_tooltip_text(search_tip)
|
||||
self.ins.search.props.has_tooltip = True
|
||||
tab.set_search_entry(0, self.ins.search)
|
||||
|
||||
# add all into the window
|
||||
self.box_group_area.pack_start(self.ins, expand=True, fill=True)
|
||||
@@ -195,60 +181,18 @@ class RecipeSelectionPage (HobPage):
|
||||
self.back_button.connect("clicked", self.back_button_clicked_cb)
|
||||
button_box.pack_end(self.back_button, expand=False, fill=False)
|
||||
|
||||
def search_entry_changed(self, entry):
|
||||
text = entry.get_text()
|
||||
if self.ins.search_focus:
|
||||
self.ins.search_focus = False
|
||||
elif self.ins.page_changed:
|
||||
self.ins.page_change = False
|
||||
self.filter_search(entry)
|
||||
elif text not in self.ins.search_names:
|
||||
self.filter_search(entry)
|
||||
|
||||
def filter_search(self, entry):
|
||||
text = entry.get_text()
|
||||
current_tab = self.ins.get_current_page()
|
||||
filter = self.pages[current_tab]['filter']
|
||||
filter[RecipeListModel.COL_NAME] = text
|
||||
self.tables[current_tab].set_model(self.recipe_model.tree_model(filter, search_data=text))
|
||||
if self.recipe_model.filtered_nb == 0:
|
||||
if not self.ins.get_nth_page(current_tab).top_bar:
|
||||
self.ins.get_nth_page(current_tab).add_no_result_bar(entry)
|
||||
self.ins.get_nth_page(current_tab).top_bar.show()
|
||||
self.ins.get_nth_page(current_tab).scroll.hide()
|
||||
else:
|
||||
if self.ins.get_nth_page(current_tab).top_bar:
|
||||
self.ins.get_nth_page(current_tab).top_bar.hide()
|
||||
self.ins.get_nth_page(current_tab).scroll.show()
|
||||
if entry.get_text() == '':
|
||||
entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
|
||||
else:
|
||||
entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, True)
|
||||
|
||||
def button_click_cb(self, widget, event):
|
||||
path, col = widget.table_tree.get_cursor()
|
||||
tree_model = widget.table_tree.get_model()
|
||||
if path and col.get_title() != 'Included': # else activation is likely a removal
|
||||
properties = {'summary': '', 'name': '', 'version': '', 'revision': '', 'binb': '', 'group': '', 'license': '', 'homepage': '', 'bugtracker': '', 'description': ''}
|
||||
properties['summary'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_SUMMARY)
|
||||
properties['name'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_NAME)
|
||||
properties['version'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_VERSION)
|
||||
properties['revision'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_REVISION)
|
||||
properties['binb'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_BINB)
|
||||
properties['group'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_GROUP)
|
||||
properties['license'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_LIC)
|
||||
properties['homepage'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_HOMEPAGE)
|
||||
properties['bugtracker'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_BUGTRACKER)
|
||||
properties['description'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_DESC)
|
||||
self.builder.show_recipe_property_dialog(properties)
|
||||
if path: # else activation is likely a removal
|
||||
binb = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_BINB)
|
||||
if binb:
|
||||
self.builder.show_binb_dialog(binb)
|
||||
|
||||
def build_packages_clicked_cb(self, button):
|
||||
self.builder.build_packages()
|
||||
|
||||
def back_button_clicked_cb(self, button):
|
||||
self.builder.recipe_model.set_selected_image(self.builder.configuration.initial_selected_image)
|
||||
self.builder.image_configuration_page.update_image_combo(self.builder.recipe_model, self.builder.configuration.initial_selected_image)
|
||||
self.builder.image_configuration_page.update_image_desc()
|
||||
self.builder.show_configuration()
|
||||
|
||||
def refresh_selection(self):
|
||||
|
||||
@@ -46,7 +46,7 @@ class RunningBuildModel (gtk.TreeStore):
|
||||
color = model.get(it, self.COL_COLOR)[0]
|
||||
if not color:
|
||||
return False
|
||||
if color == HobColors.ERROR or color == HobColors.WARNING:
|
||||
if color == HobColors.ERROR:
|
||||
return True
|
||||
return False
|
||||
|
||||
@@ -76,7 +76,7 @@ class RunningBuild (gobject.GObject):
|
||||
'build-complete' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
()),
|
||||
'build-aborted' : (gobject.SIGNAL_RUN_LAST,
|
||||
'build-aborted' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
()),
|
||||
'task-started' : (gobject.SIGNAL_RUN_LAST,
|
||||
@@ -85,12 +85,6 @@ class RunningBuild (gobject.GObject):
|
||||
'log-error' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
()),
|
||||
'log-warning' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
()),
|
||||
'disk-full' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
()),
|
||||
'no-provider' : (gobject.SIGNAL_RUN_LAST,
|
||||
gobject.TYPE_NONE,
|
||||
(gobject.TYPE_PYOBJECT,)),
|
||||
@@ -154,7 +148,6 @@ class RunningBuild (gobject.GObject):
|
||||
elif event.levelno >= logging.WARNING:
|
||||
icon = "dialog-warning"
|
||||
color = HobColors.WARNING
|
||||
self.emit("log-warning")
|
||||
else:
|
||||
icon = None
|
||||
color = HobColors.OK
|
||||
@@ -293,7 +286,6 @@ class RunningBuild (gobject.GObject):
|
||||
# Emit the appropriate signal depending on the number of failures
|
||||
if self.buildaborted:
|
||||
self.emit ("build-aborted")
|
||||
self.buildaborted = False
|
||||
elif (failures >= 1):
|
||||
self.emit ("build-failed")
|
||||
else:
|
||||
@@ -308,7 +300,6 @@ class RunningBuild (gobject.GObject):
|
||||
|
||||
elif isinstance(event, bb.event.DiskFull):
|
||||
self.buildaborted = True
|
||||
self.emit("disk-full")
|
||||
|
||||
elif isinstance(event, bb.command.CommandFailed):
|
||||
self.emit("log", "error", "Command execution failed: %s" % (event.error))
|
||||
|
||||
@@ -137,6 +137,8 @@ class RecipeFile(ConfigFile):
|
||||
|
||||
class TemplateMgr(gobject.GObject):
|
||||
|
||||
__gLocalVars__ = ["MACHINE", "PACKAGE_CLASSES", "DISTRO", "DL_DIR", "SSTATE_DIR", "SSTATE_MIRRORS", "PARALLEL_MAKE", "BB_NUMBER_THREADS", "CONF_VERSION"]
|
||||
__gBBLayersVars__ = ["BBLAYERS", "LCONF_VERSION"]
|
||||
__gRecipeVars__ = ["DEPENDS", "IMAGE_INSTALL"]
|
||||
|
||||
def __init__(self):
|
||||
@@ -150,21 +152,37 @@ class TemplateMgr(gobject.GObject):
|
||||
def convert_to_template_pathfilename(cls, filename, path):
|
||||
return "%s/%s%s%s" % (path, "template-", filename, ".hob")
|
||||
|
||||
@classmethod
|
||||
def convert_to_bblayers_pathfilename(cls, filename, path):
|
||||
return "%s/%s%s%s" % (path, "bblayers-", filename, ".conf")
|
||||
|
||||
@classmethod
|
||||
def convert_to_local_pathfilename(cls, filename, path):
|
||||
return "%s/%s%s%s" % (path, "local-", filename, ".conf")
|
||||
|
||||
@classmethod
|
||||
def convert_to_image_pathfilename(cls, filename, path):
|
||||
return "%s/%s%s%s" % (path, "hob-image-", filename, ".bb")
|
||||
|
||||
def open(self, filename, path):
|
||||
self.template_hob = HobTemplateFile(TemplateMgr.convert_to_template_pathfilename(filename, path))
|
||||
self.bblayers_conf = ConfigFile(TemplateMgr.convert_to_bblayers_pathfilename(filename, path))
|
||||
self.local_conf = ConfigFile(TemplateMgr.convert_to_local_pathfilename(filename, path))
|
||||
self.image_bb = RecipeFile(TemplateMgr.convert_to_image_pathfilename(filename, path))
|
||||
|
||||
def setVar(self, var, val):
|
||||
if var in TemplateMgr.__gLocalVars__:
|
||||
self.local_conf.setVar(var, val)
|
||||
if var in TemplateMgr.__gBBLayersVars__:
|
||||
self.bblayers_conf.setVar(var, val)
|
||||
if var in TemplateMgr.__gRecipeVars__:
|
||||
self.image_bb.setVar(var, val)
|
||||
|
||||
self.template_hob.setVar(var, val)
|
||||
|
||||
def save(self):
|
||||
self.local_conf.save()
|
||||
self.bblayers_conf.save()
|
||||
self.image_bb.save()
|
||||
self.template_hob.save()
|
||||
|
||||
@@ -182,6 +200,12 @@ class TemplateMgr(gobject.GObject):
|
||||
if self.template_hob:
|
||||
del self.template_hob
|
||||
template_hob = None
|
||||
if self.bblayers_conf:
|
||||
del self.bblayers_conf
|
||||
self.bblayers_conf = None
|
||||
if self.local_conf:
|
||||
del self.local_conf
|
||||
self.local_conf = None
|
||||
if self.image_bb:
|
||||
del self.image_bb
|
||||
self.image_bb = None
|
||||
|
||||
@@ -198,23 +198,17 @@ class gtkthread(threading.Thread):
|
||||
|
||||
def main(server, eventHandler):
|
||||
try:
|
||||
cmdline, error = server.runCommand(["getCmdLineAction"])
|
||||
if error:
|
||||
print("Error getting bitbake commandline: %s" % error)
|
||||
return 1
|
||||
elif not cmdline:
|
||||
print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
|
||||
return 1
|
||||
elif not cmdline or cmdline[0] != "generateDotGraph":
|
||||
cmdline = server.runCommand(["getCmdLineAction"])
|
||||
if cmdline and not cmdline['action']:
|
||||
print(cmdline['msg'])
|
||||
return
|
||||
elif not cmdline or (cmdline['action'] and cmdline['action'][0] != "generateDotGraph"):
|
||||
print("This UI is only compatible with the -g option")
|
||||
return 1
|
||||
ret, error = server.runCommand(["generateDepTreeEvent", cmdline[1], cmdline[2]])
|
||||
if error:
|
||||
print("Error running command '%s': %s" % (cmdline, error))
|
||||
return 1
|
||||
elif ret != True:
|
||||
print("Error running command '%s': returned %s" % (cmdline, ret))
|
||||
return 1
|
||||
return
|
||||
ret = server.runCommand(["generateDepTreeEvent", cmdline['action'][1], cmdline['action'][2]])
|
||||
if ret != True:
|
||||
print("Couldn't run command! %s" % ret)
|
||||
return
|
||||
except xmlrpclib.Fault as x:
|
||||
print("XMLRPC Fault getting commandline:\n %s" % x)
|
||||
return
|
||||
@@ -240,9 +234,7 @@ def main(server, eventHandler):
|
||||
try:
|
||||
event = eventHandler.waitEvent(0.25)
|
||||
if gtkthread.quit.isSet():
|
||||
_, error = server.runCommand(["stateStop"])
|
||||
if error:
|
||||
print('Unable to cleanly stop: %s' % error)
|
||||
server.runCommand(["stateStop"])
|
||||
break
|
||||
|
||||
if event is None:
|
||||
@@ -318,13 +310,9 @@ def main(server, eventHandler):
|
||||
break
|
||||
if shutdown == 1:
|
||||
print("\nSecond Keyboard Interrupt, stopping...\n")
|
||||
_, error = server.runCommand(["stateStop"])
|
||||
if error:
|
||||
print('Unable to cleanly stop: %s' % error)
|
||||
server.runCommand(["stateStop"])
|
||||
if shutdown == 0:
|
||||
print("\nKeyboard Interrupt, closing down...\n")
|
||||
_, error = server.runCommand(["stateShutdown"])
|
||||
if error:
|
||||
print('Unable to cleanly shutdown: %s' % error)
|
||||
server.runCommand(["stateShutdown"])
|
||||
shutdown = shutdown + 1
|
||||
pass
|
||||
|
||||
@@ -80,19 +80,16 @@ def main (server, eventHandler):
|
||||
running_build.connect ("build-failed", running_build_failed_cb)
|
||||
|
||||
try:
|
||||
cmdline, error = server.runCommand(["getCmdLineAction"])
|
||||
if error:
|
||||
print("Error getting bitbake commandline: %s" % error)
|
||||
return 1
|
||||
elif not cmdline:
|
||||
cmdline = server.runCommand(["getCmdLineAction"])
|
||||
if not cmdline:
|
||||
print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
|
||||
return 1
|
||||
ret, error = server.runCommand(cmdline)
|
||||
if error:
|
||||
print("Error running command '%s': %s" % (cmdline, error))
|
||||
elif not cmdline['action']:
|
||||
print(cmdline['msg'])
|
||||
return 1
|
||||
elif ret != True:
|
||||
print("Error running command '%s': returned %s" % (cmdline, ret))
|
||||
ret = server.runCommand(cmdline['action'])
|
||||
if ret != True:
|
||||
print("Couldn't get default commandline! %s" % ret)
|
||||
return 1
|
||||
except xmlrpclib.Fault as x:
|
||||
print("XMLRPC Fault getting commandline:\n %s" % x)
|
||||
|
||||
@@ -22,7 +22,7 @@
|
||||
|
||||
import sys
|
||||
import os
|
||||
requirements = "FATAL: Hob requires Gtk+ 2.20.0 or higher, PyGtk 2.21.0 or higher"
|
||||
requirements = "FATAL: Gtk+, PyGtk and PyGobject are required to use Hob"
|
||||
try:
|
||||
import gobject
|
||||
import gtk
|
||||
|
||||
@@ -146,7 +146,7 @@ class TerminalFilter(object):
|
||||
import curses
|
||||
except ImportError:
|
||||
sys.exit("FATAL: The knotty ui could not load the required curses python module.")
|
||||
|
||||
|
||||
import termios
|
||||
self.curses = curses
|
||||
self.termios = termios
|
||||
@@ -157,8 +157,6 @@ class TerminalFilter(object):
|
||||
new[3] = new[3] & ~termios.ECHO
|
||||
termios.tcsetattr(fd, termios.TCSADRAIN, new)
|
||||
curses.setupterm()
|
||||
if curses.tigetnum("colors") > 2:
|
||||
format.enable_color()
|
||||
self.ed = curses.tigetstr("ed")
|
||||
if self.ed:
|
||||
self.cuu = curses.tigetstr("cuu")
|
||||
@@ -189,7 +187,7 @@ class TerminalFilter(object):
|
||||
return
|
||||
if self.footer_present:
|
||||
self.clearFooter()
|
||||
if (not self.helper.tasknumber_total or self.helper.tasknumber_current == self.helper.tasknumber_total) and not len(activetasks):
|
||||
if not self.helper.tasknumber_total or self.helper.tasknumber_current == self.helper.tasknumber_total:
|
||||
return
|
||||
tasks = []
|
||||
for t in runningpids:
|
||||
@@ -219,19 +217,9 @@ class TerminalFilter(object):
|
||||
def main(server, eventHandler, tf = TerminalFilter):
|
||||
|
||||
# Get values of variables which control our output
|
||||
includelogs, error = server.runCommand(["getVariable", "BBINCLUDELOGS"])
|
||||
if error:
|
||||
logger.error("Unable to get the value of BBINCLUDELOGS variable: %s" % error)
|
||||
return 1
|
||||
loglines, error = server.runCommand(["getVariable", "BBINCLUDELOGS_LINES"])
|
||||
if error:
|
||||
logger.error("Unable to get the value of BBINCLUDELOGS_LINES variable: %s" % error)
|
||||
return 1
|
||||
consolelogfile, error = server.runCommand(["getVariable", "BB_CONSOLELOG"])
|
||||
if error:
|
||||
logger.error("Unable to get the value of BB_CONSOLELOG variable: %s" % error)
|
||||
return 1
|
||||
|
||||
includelogs = server.runCommand(["getVariable", "BBINCLUDELOGS"])
|
||||
loglines = server.runCommand(["getVariable", "BBINCLUDELOGS_LINES"])
|
||||
consolelogfile = server.runCommand(["getVariable", "BB_CONSOLELOG"])
|
||||
if sys.stdin.isatty() and sys.stdout.isatty():
|
||||
log_exec_tty = True
|
||||
else:
|
||||
@@ -240,36 +228,31 @@ def main(server, eventHandler, tf = TerminalFilter):
|
||||
helper = uihelper.BBUIHelper()
|
||||
|
||||
console = logging.StreamHandler(sys.stdout)
format_str = "%(levelname)s: %(message)s"
format = bb.msg.BBLogFormatter(format_str)
format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
bb.msg.addDefaultlogFilter(console)
console.setFormatter(format)
logger.addHandler(console)
if consolelogfile:
bb.utils.mkdirhier(os.path.dirname(consolelogfile))
conlogformat = bb.msg.BBLogFormatter(format_str)
consolelog = logging.FileHandler(consolelogfile)
bb.msg.addDefaultlogFilter(consolelog)
consolelog.setFormatter(conlogformat)
consolelog.setFormatter(format)
logger.addHandler(consolelog)
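The block above attaches the same formatter to a console `StreamHandler` and, when a console log file is configured, to a `FileHandler` as well, so build messages reach both destinations. A generic Python 3 sketch of that dual-handler setup (plain `logging.Formatter` standing in for `BBLogFormatter`, paths are examples):

```python
import logging
import os
import sys

logger = logging.getLogger("BitBake")

def setup_logging(consolelogfile=None):
    """Send log records to stdout, and to a file if a path is given (sketch)."""
    fmt = logging.Formatter("%(levelname)s: %(message)s")

    console = logging.StreamHandler(sys.stdout)
    console.setFormatter(fmt)
    logger.addHandler(console)

    if consolelogfile:
        os.makedirs(os.path.dirname(consolelogfile), exist_ok=True)
        filelog = logging.FileHandler(consolelogfile)
        filelog.setFormatter(fmt)
        logger.addHandler(filelog)

setup_logging("/tmp/bitbake/console.log")
logger.warning("this line reaches both the terminal and the log file")
```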
|
||||
|
||||
try:
|
||||
cmdline, error = server.runCommand(["getCmdLineAction"])
|
||||
if error:
|
||||
logger.error("Unable to get bitbake commandline arguments: %s" % error)
|
||||
return 1
|
||||
elif not cmdline:
|
||||
cmdline = server.runCommand(["getCmdLineAction"])
|
||||
if not cmdline:
|
||||
print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
|
||||
return 1
|
||||
ret, error = server.runCommand(cmdline)
|
||||
if error:
|
||||
logger.error("Command '%s' failed: %s" % (cmdline, error))
|
||||
elif not cmdline['action']:
|
||||
print(cmdline['msg'])
|
||||
return 1
|
||||
elif ret != True:
|
||||
logger.error("Command '%s' failed: returned %s" % (cmdline, ret))
|
||||
ret = server.runCommand(cmdline['action'])
|
||||
if ret != True:
|
||||
print("Couldn't get default commandline! %s" % ret)
|
||||
return 1
|
||||
except xmlrpclib.Fault as x:
|
||||
logger.error("XMLRPC Fault getting commandline:\n %s" % x)
|
||||
print("XMLRPC Fault getting commandline:\n %s" % x)
|
||||
return 1
|
||||
|
||||
parseprogress = None
|
||||
@@ -329,7 +312,7 @@ def main(server, eventHandler, tf = TerminalFilter):
|
||||
logfile = event.logfile
|
||||
if logfile and os.path.exists(logfile):
|
||||
termfilter.clearFooter()
|
||||
bb.error("Logfile of failure stored in: %s" % logfile)
|
||||
print("ERROR: Logfile of failure stored in: %s" % logfile)
|
||||
if includelogs and not event.errprinted:
|
||||
print("Log data follows:")
|
||||
f = open(logfile, "r")
|
||||
@@ -453,8 +436,7 @@ def main(server, eventHandler, tf = TerminalFilter):
|
||||
bb.runqueue.runQueueExitWait,
|
||||
bb.event.OperationStarted,
|
||||
bb.event.OperationCompleted,
|
||||
bb.event.OperationProgress,
|
||||
bb.event.DiskFull)):
|
||||
bb.event.OperationProgress)):
|
||||
continue
|
||||
|
||||
logger.error("Unknown event: %s", event)
|
||||
@@ -468,15 +450,11 @@ def main(server, eventHandler, tf = TerminalFilter):
|
||||
termfilter.clearFooter()
|
||||
if main.shutdown == 1:
|
||||
print("\nSecond Keyboard Interrupt, stopping...\n")
|
||||
_, error = server.runCommand(["stateStop"])
|
||||
if error:
|
||||
logger.error("Unable to cleanly stop: %s" % error)
|
||||
server.runCommand(["stateStop"])
|
||||
if main.shutdown == 0:
|
||||
print("\nKeyboard Interrupt, closing down...\n")
|
||||
interrupted = True
|
||||
_, error = server.runCommand(["stateShutdown"])
|
||||
if error:
|
||||
logger.error("Unable to cleanly shutdown: %s" % error)
|
||||
print("\nKeyboard Interrupt, closing down...\n")
|
||||
server.runCommand(["stateShutdown"])
|
||||
main.shutdown = main.shutdown + 1
|
||||
pass
|
||||
|
||||
|
||||
@@ -236,18 +236,15 @@ class NCursesUI:
|
||||
shutdown = 0
|
||||
|
||||
try:
|
||||
cmdline, error = server.runCommand(["getCmdLineAction"])
|
||||
cmdline = server.runCommand(["getCmdLineAction"])
|
||||
if not cmdline:
|
||||
print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
|
||||
return
|
||||
elif error:
|
||||
print("Error getting bitbake commandline: %s" % error)
|
||||
elif not cmdline['action']:
|
||||
print(cmdline['msg'])
|
||||
return
|
||||
ret, error = server.runCommand(cmdline)
|
||||
if error:
|
||||
print("Error running command '%s': %s" % (cmdline, error))
|
||||
return
|
||||
elif ret != True:
|
||||
ret = server.runCommand(cmdline['action'])
|
||||
if ret != True:
|
||||
print("Couldn't get default commandlind! %s" % ret)
|
||||
return
|
||||
except xmlrpclib.Fault as x:
|
||||
@@ -348,14 +345,10 @@ class NCursesUI:
|
||||
exitflag = True
|
||||
if shutdown == 1:
|
||||
mw.appendText("Second Keyboard Interrupt, stopping...\n")
|
||||
_, error = server.runCommand(["stateStop"])
|
||||
if error:
|
||||
print("Unable to cleanly stop: %s" % error)
|
||||
server.runCommand(["stateStop"])
|
||||
if shutdown == 0:
|
||||
mw.appendText("Keyboard Interrupt, closing down...\n")
|
||||
_, error = server.runCommand(["stateShutdown"])
|
||||
if error:
|
||||
print("Unable to cleanly shutdown: %s" % error)
|
||||
server.runCommand(["stateShutdown"])
|
||||
shutdown = shutdown + 1
|
||||
pass
|
||||
|
||||
|
||||
@@ -37,17 +37,6 @@ class BBUIEventQueue:
|
||||
self.BBServer = BBServer
|
||||
self.clientinfo = clientinfo
|
||||
|
||||
server = UIXMLRPCServer(self.clientinfo)
|
||||
self.host, self.port = server.socket.getsockname()
|
||||
|
||||
server.register_function( self.system_quit, "event.quit" )
|
||||
server.register_function( self.send_event, "event.sendpickle" )
|
||||
server.socket.settimeout(1)
|
||||
|
||||
self.EventHandle = self.BBServer.registerEventHandler(self.host, self.port)
|
||||
|
||||
self.server = server
|
||||
|
||||
self.t = threading.Thread()
|
||||
self.t.setDaemon(True)
|
||||
self.t.run = self.startCallbackHandler
|
||||
@@ -84,9 +73,19 @@ class BBUIEventQueue:
|
||||
|
||||
def startCallbackHandler(self):

while not self.server.quit:
self.server.handle_request()
self.server.server_close()
server = UIXMLRPCServer(self.clientinfo)
self.host, self.port = server.socket.getsockname()

server.register_function( self.system_quit, "event.quit" )
server.register_function( self.send_event, "event.sendpickle" )
server.socket.settimeout(1)

self.EventHandle = self.BBServer.registerEventHandler(self.host, self.port)

self.server = server
while not server.quit:
server.handle_request()
server.server_close()
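The refactor above moves construction of the UI's XMLRPC event server into the callback-handler thread itself: the thread binds a socket, registers the `event.quit` and `event.sendpickle` callbacks, then loops on `handle_request()` until asked to quit. A minimal sketch of that loop with the standard-library `SimpleXMLRPCServer` (the real code uses a custom `UIXMLRPCServer` subclass and a socket-level timeout):

```python
from xmlrpc.server import SimpleXMLRPCServer  # "SimpleXMLRPCServer" module on Python 2
import threading

class EventQueueServer:
    """Serve UI event callbacks on a short-timeout XMLRPC loop (sketch)."""

    def __init__(self):
        self.quit = False

    def system_quit(self):
        self.quit = True
        return True

    def handle_event(self, payload):
        print("got event: %r" % (payload,))
        return True

    def run(self):
        server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
        host, port = server.socket.getsockname()
        server.register_function(self.system_quit, "event.quit")
        server.register_function(self.handle_event, "event.sendpickle")
        server.timeout = 1           # handle_request() returns after 1s of silence
        print("listening on %s:%d" % (host, port))
        while not self.quit:
            server.handle_request()  # one request (or a timeout) per iteration
        server.server_close()

# Example: run the loop on a background thread, as the BitBake UIs do.
threading.Thread(target=EventQueueServer().run, daemon=True).start()
```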
|
||||
|
||||
def system_quit( self ):
|
||||
"""
|
||||
|
||||
@@ -51,7 +51,6 @@ class BBUIHelper:
|
||||
if isinstance(event, bb.runqueue.runQueueTaskStarted) or isinstance(event, bb.runqueue.sceneQueueTaskStarted):
|
||||
self.tasknumber_current = event.stats.completed + event.stats.active + event.stats.failed + 1
|
||||
self.tasknumber_total = event.stats.total
|
||||
self.needUpdate = True
|
||||
|
||||
def getTasks(self):
|
||||
self.needUpdate = False
|
||||
|
||||
@@ -458,6 +458,27 @@ def preserved_envvars_exported():
|
||||
'USER',
|
||||
]
|
||||
|
||||
def preserved_envvars_exported_interactive():
|
||||
"""Variables which are taken from the environment and placed in and exported
|
||||
from the metadata, for interactive tasks"""
|
||||
return [
|
||||
'COLORTERM',
|
||||
'DBUS_SESSION_BUS_ADDRESS',
|
||||
'DESKTOP_SESSION',
|
||||
'DESKTOP_STARTUP_ID',
|
||||
'DISPLAY',
|
||||
'GNOME_KEYRING_PID',
|
||||
'GNOME_KEYRING_SOCKET',
|
||||
'GPG_AGENT_INFO',
|
||||
'GTK_RC_FILES',
|
||||
'SESSION_MANAGER',
|
||||
'KRB5CCNAME',
|
||||
'SSH_AUTH_SOCK',
|
||||
'XAUTHORITY',
|
||||
'XDG_DATA_DIRS',
|
||||
'XDG_SESSION_COOKIE',
|
||||
]
|
||||
|
||||
def preserved_envvars():
|
||||
"""Variables which are taken from the environment and placed in the metadata"""
|
||||
v = [
|
||||
@@ -466,7 +487,7 @@ def preserved_envvars():
|
||||
'BB_ENV_WHITELIST',
|
||||
'BB_ENV_EXTRAWHITE',
|
||||
]
|
||||
return v + preserved_envvars_exported()
|
||||
return v + preserved_envvars_exported() + preserved_envvars_exported_interactive()
|
||||
|
||||
def filter_environment(good_vars):
"""
@@ -474,20 +495,24 @@
are not known and may influence the build in a negative way.
"""

removed_vars = {}
removed_vars = []
for key in os.environ.keys():
if key in good_vars:
continue

removed_vars[key] = os.environ[key]
removed_vars.append(key)
os.unsetenv(key)
del os.environ[key]

if len(removed_vars):
logger.debug(1, "Removed the following variables from the environment: %s", ", ".join(removed_vars.keys()))
logger.debug(1, "Removed the following variables from the environment: %s", ", ".join(removed_vars))

return removed_vars
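The hunk above changes `filter_environment` to record the removed variables as name/value pairs (a dict) rather than just their names, and to return them to the caller. A minimal, self-contained sketch of that whitelist-filtering idea (not the exact BitBake implementation):

```python
import os

def filter_environment(good_vars):
    """Delete every environment variable not listed in good_vars (sketch).

    Returns the removed name -> value pairs so the caller could log
    them or restore the environment later.
    """
    removed = {}
    # list() so we can delete keys while iterating.
    for key in list(os.environ.keys()):
        if key in good_vars:
            continue
        removed[key] = os.environ[key]
        del os.environ[key]
    if removed:
        print("Removed from the environment: %s" % ", ".join(sorted(removed)))
    return removed

# Example: keep only PATH and HOME before spawning a reproducible task.
saved = filter_environment({"PATH", "HOME"})
```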
|
||||
|
||||
def create_interactive_env(d):
|
||||
for k in preserved_envvars_exported_interactive():
|
||||
os.setenv(k, d.getVar(k, True))
|
||||
|
||||
def approved_variables():
|
||||
"""
|
||||
Determine and return the list of whitelisted variables which are approved
|
||||
@@ -496,13 +521,10 @@ def approved_variables():
|
||||
approved = []
|
||||
if 'BB_ENV_WHITELIST' in os.environ:
|
||||
approved = os.environ['BB_ENV_WHITELIST'].split()
|
||||
approved.extend(['BB_ENV_WHITELIST'])
|
||||
else:
|
||||
approved = preserved_envvars()
|
||||
if 'BB_ENV_EXTRAWHITE' in os.environ:
|
||||
approved.extend(os.environ['BB_ENV_EXTRAWHITE'].split())
|
||||
if 'BB_ENV_EXTRAWHITE' not in approved:
|
||||
approved.extend(['BB_ENV_EXTRAWHITE'])
|
||||
return approved
|
||||
|
||||
def clean_environment():
|
||||
@@ -512,9 +534,7 @@ def clean_environment():
|
||||
"""
|
||||
if 'BB_PRESERVE_ENV' not in os.environ:
|
||||
good_vars = approved_variables()
|
||||
return filter_environment(good_vars)
|
||||
|
||||
return {}
|
||||
filter_environment(good_vars)
|
||||
|
||||
def empty_environment():
|
||||
"""
|
||||
@@ -538,17 +558,14 @@
"""Equivalent to rm -f or rm -rf"""
if not path:
return
if recurse:
import subprocess, glob
# shutil.rmtree(name) would be ideal but its too slow
subprocess.call(['rm', '-rf'] + glob.glob(path))
return
import os, errno, glob
import os, errno, shutil, glob
for name in glob.glob(path):
try:
os.unlink(name)
except OSError as exc:
if exc.errno != errno.ENOENT:
if recurse and exc.errno == errno.EISDIR:
shutil.rmtree(name)
elif exc.errno != errno.ENOENT:
raise
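The hunk above shows two generations of `bb.utils.remove`: one deletes directories by shelling out to `rm -rf` (the comment notes `shutil.rmtree` was too slow), the other falls back to `shutil.rmtree` only when `os.unlink` hits `EISDIR`. A sketch of the glob-then-delete pattern, following the subprocess variant (the paths in the example are hypothetical):

```python
import errno
import glob
import os
import subprocess

def remove(path, recurse=False):
    """Equivalent to rm -f, or rm -rf when recurse=True (sketch only)."""
    if not path:
        return
    if recurse:
        # One external call removes every match, directories included.
        subprocess.call(['rm', '-rf'] + glob.glob(path))
        return
    for name in glob.glob(path):
        try:
            os.unlink(name)
        except OSError as exc:
            # A missing file is fine for "rm -f" semantics.
            if exc.errno != errno.ENOENT:
                raise

remove('/tmp/build-scratch/*.tmp')          # rm -f behaviour
remove('/tmp/build-scratch', recurse=True)  # rm -rf behaviour
```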
|
||||
|
||||
def prunedir(topdir):
|
||||
@@ -803,31 +820,3 @@ def cpu_count():
|
||||
def nonblockingfd(fd):
|
||||
fcntl.fcntl(fd, fcntl.F_SETFL, fcntl.fcntl(fd, fcntl.F_GETFL) | os.O_NONBLOCK)
|
||||
|
||||
def process_profilelog(fn):
|
||||
# Redirect stdout to capture profile information
|
||||
pout = open(fn + '.processed', 'w')
|
||||
so = sys.stdout.fileno()
|
||||
orig_so = os.dup(sys.stdout.fileno())
|
||||
os.dup2(pout.fileno(), so)
|
||||
|
||||
import pstats
|
||||
p = pstats.Stats(fn)
|
||||
p.sort_stats('time')
|
||||
p.print_stats()
|
||||
p.print_callers()
|
||||
p.sort_stats('cumulative')
|
||||
p.print_stats()
|
||||
|
||||
os.dup2(orig_so, so)
|
||||
pout.flush()
|
||||
pout.close()
|
||||
|
||||
#
# Work around multiprocessing pool bugs in python < 2.7.3
#
def multiprocessingpool(*args, **kwargs):
if sys.version_info < (2, 7, 3):
return bb.compat.Pool(*args, **kwargs)
else:
return multiprocessing.pool.Pool(*args, **kwargs)
|
||||
|
||||
|
||||
@@ -86,8 +86,7 @@ class PRServer(SimpleXMLRPCServer):
|
||||
def work_forever(self,):
|
||||
self.quit = False
|
||||
self.timeout = 0.5
|
||||
|
||||
logger.info("Started PRServer with DBfile: %s, IP: %s, PORT: %s, PID: %s" %
|
||||
logger.info("PRServer: started! DBfile: %s, IP: %s, PORT: %s, PID: %s" %
|
||||
(self.dbfile, self.host, self.port, str(os.getpid())))
|
||||
|
||||
while not self.quit:
|
||||
@@ -98,10 +97,16 @@ class PRServer(SimpleXMLRPCServer):
|
||||
return
|
||||
|
||||
def start(self):
|
||||
pid = self.daemonize()
|
||||
# Ensure both the parent sees this and the child from the work_forever log entry above
|
||||
logger.info("Started PRServer with DBfile: %s, IP: %s, PORT: %s, PID: %s" %
|
||||
(self.dbfile, self.host, self.port, str(pid)))
|
||||
if self.daemon is True:
|
||||
logger.info("PRServer: try to start daemon...")
|
||||
self.daemonize()
|
||||
else:
|
||||
atexit.register(self.delpid)
|
||||
pid = str(os.getpid())
|
||||
pf = file(self.pidfile, 'w+')
|
||||
pf.write("%s\n" % pid)
|
||||
pf.close()
|
||||
self.work_forever()
|
||||
|
||||
def delpid(self):
|
||||
os.remove(self.pidfile)
|
||||
@@ -113,9 +118,8 @@ class PRServer(SimpleXMLRPCServer):
|
||||
try:
|
||||
pid = os.fork()
|
||||
if pid > 0:
|
||||
os.waitpid(pid, 0)
|
||||
#parent return instead of exit to give control
|
||||
return pid
|
||||
return
|
||||
except OSError as e:
|
||||
raise Exception("%s [%d]" % (e.strerror, e.errno))
|
||||
|
||||
@@ -127,7 +131,7 @@ class PRServer(SimpleXMLRPCServer):
|
||||
try:
|
||||
pid = os.fork()
|
||||
if pid > 0: #parent
|
||||
os._exit(0)
|
||||
sys.exit(0)
|
||||
except OSError as e:
|
||||
raise Exception("%s [%d]" % (e.strerror, e.errno))
|
||||
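The two fork() hunks above follow the standard double-fork daemonization pattern; an illustrative, self-contained sketch of that pattern (not the PRServer code itself):

    import os

    def daemonize_sketch(run):
        if os.fork() > 0:      # first fork: parent returns (PRServer waitpid()s so it can hand back the pid)
            return
        os.setsid()            # detach from the controlling terminal
        if os.fork() > 0:      # second fork: the daemon can never reacquire a controlling tty
            os._exit(0)        # os._exit, not sys.exit, so atexit handlers do not fire in this parent
        run()                  # long-running body, e.g. work_forever()
        os._exit(0)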
|
||||
@@ -143,22 +147,15 @@ class PRServer(SimpleXMLRPCServer):
|
||||
os.dup2(so.fileno(),sys.stdout.fileno())
|
||||
os.dup2(se.fileno(),sys.stderr.fileno())
|
||||
|
||||
# Ensure logging makes it to the logfile
|
||||
streamhandler = logging.StreamHandler()
|
||||
streamhandler.setLevel(logging.DEBUG)
|
||||
formatter = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
|
||||
streamhandler.setFormatter(formatter)
|
||||
logger.addHandler(streamhandler)
|
||||
|
||||
# write pidfile
|
||||
atexit.register(self.delpid)
|
||||
pid = str(os.getpid())
|
||||
pf = file(self.pidfile, 'w')
|
||||
pf.write("%s\n" % pid)
|
||||
pf.close()
|
||||
|
||||
self.work_forever()
|
||||
self.delpid
|
||||
os._exit(0)
|
||||
sys.exit(0)
|
||||
|
||||
class PRServSingleton():
|
||||
def __init__(self, dbfile, logfile, interface):
|
||||
@@ -167,14 +164,21 @@ class PRServSingleton():
|
||||
self.interface = interface
|
||||
self.host = None
|
||||
self.port = None
|
||||
self.event = threading.Event()
|
||||
|
||||
def start(self):
|
||||
self.prserv = PRServer(self.dbfile, self.logfile, self.interface)
|
||||
self.prserv.start()
|
||||
def _work(self):
|
||||
self.prserv = PRServer(self.dbfile, self.logfile, self.interface, False)
|
||||
self.host, self.port = self.prserv.getinfo()
|
||||
self.event.set()
|
||||
self.prserv.work_forever()
|
||||
del self.prserv.db
|
||||
|
||||
def start(self):
|
||||
self.working_thread = threading.Thread(target=self._work)
|
||||
self.working_thread.start()
|
||||
|
||||
def getinfo(self):
|
||||
self.event.wait()
|
||||
return (self.host, self.port)
|
||||
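The rewritten PRServSingleton above uses a threading.Event as a start-up handshake: the worker thread publishes the bound address, and getinfo() blocks until it is known. An illustrative stand-alone sketch of that pattern (host and port values are placeholders):

    import threading

    class SingletonSketch:
        def __init__(self):
            self.host = self.port = None
            self.event = threading.Event()

        def _work(self):
            self.host, self.port = "127.0.0.1", 8585   # stand-in for prserv.getinfo()
            self.event.set()                           # signal that the address is now valid
            # ... serve requests forever ...

        def start(self):
            threading.Thread(target=self._work).start()

        def getinfo(self):
            self.event.wait()                          # block until _work() has published host/port
            return (self.host, self.port)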
|
||||
class PRServerConnection():
|
||||
@@ -190,7 +194,6 @@ class PRServerConnection():
|
||||
import socket
|
||||
socket.setdefaulttimeout(2)
|
||||
try:
|
||||
logger.info("Terminating PRServer...")
|
||||
self.connection.quit()
|
||||
except Exception as exc:
|
||||
sys.stderr.write("%s\n" % str(exc))
|
||||
@@ -263,27 +266,17 @@ def is_local_special(host, port):
|
||||
else:
|
||||
return False
|
||||
|
||||
class PRServiceConfigError(Exception):
|
||||
pass
|
||||
|
||||
def auto_start(d):
|
||||
global singleton
|
||||
|
||||
host_params = filter(None, (d.getVar('PRSERV_HOST', True) or '').split(':'))
|
||||
if not host_params:
|
||||
if (not d.getVar('PRSERV_HOST', True)) or (not d.getVar('PRSERV_PORT', True)):
|
||||
return True
|
||||
|
||||
if len(host_params) != 2:
|
||||
logger.critical('\n'.join(['PRSERV_HOST: incorrect format',
|
||||
'Usage: PRSERV_HOST = "<hostname>:<port>"']))
|
||||
raise PRServiceConfigError
|
||||
|
||||
if is_local_special(host_params[0], int(host_params[1])) and not singleton:
|
||||
if is_local_special(d.getVar('PRSERV_HOST', True), int(d.getVar('PRSERV_PORT', True))) and not singleton:
|
||||
import bb.utils
|
||||
cachedir = (d.getVar("PERSISTENT_DIR", True) or d.getVar("CACHE", True))
|
||||
if not cachedir:
|
||||
logger.critical("Please set the 'PERSISTENT_DIR' or 'CACHE' variable")
|
||||
raise PRServiceConfigError
|
||||
sys.exit(1)
|
||||
bb.utils.mkdirhier(cachedir)
|
||||
dbfile = os.path.join(cachedir, "prserv.sqlite3")
|
||||
logfile = os.path.join(cachedir, "prserv.log")
|
||||
@@ -292,14 +285,14 @@ def auto_start(d):
|
||||
if singleton:
|
||||
host, port = singleton.getinfo()
|
||||
else:
|
||||
host = host_params[0]
|
||||
port = int(host_params[1])
|
||||
host = d.getVar('PRSERV_HOST', True)
|
||||
port = int(d.getVar('PRSERV_PORT', True))
|
||||
|
||||
try:
|
||||
return PRServerConnection(host,port).ping()
|
||||
except Exception:
|
||||
logger.critical("PRservice %s:%d not available" % (host, port))
|
||||
raise PRServiceConfigError
|
||||
return False
|
||||
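For reference, a small sketch of the PRSERV_HOST parsing that auto_start() now performs; the value below is an example and the error type merely stands in for PRServiceConfigError:

    value = "localhost:8585"                            # what PRSERV_HOST might be set to in local.conf
    host_params = [p for p in value.split(':') if p]    # same effect as filter(None, ...)
    if len(host_params) != 2:
        raise ValueError('PRSERV_HOST: incorrect format, expected "<hostname>:<port>"')
    host, port = host_params[0], int(host_params[1])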
|
||||
def auto_shutdown(d=None):
|
||||
global singleton
|
||||
|
||||
@@ -1,93 +1,98 @@
|
||||
# This is a single Makefile to handle all generated Yocto Project documents.
|
||||
# The Makefile needs to live in the documents directory and all figures used
|
||||
# in any manuals must be .PNG files and live in the individual book's figures
|
||||
# directory as well as in the figures directory for the mega-manual.
|
||||
# directory as well as in the figures directory for the mega-manual.
|
||||
# Note that the figures for the Yocto Project Development Manual
|
||||
# differ depending on the BRANCH being built.
|
||||
#
|
||||
# The Makefile has these targets:
|
||||
#
|
||||
# pdf: generates a PDF version of a manual. Not valid for the Quick Start
|
||||
# or the mega-manual (single, large HTML file comprised of all
|
||||
# or the mega-manual (single, large HTML file comprised of all
|
||||
# Yocto Project manuals).
|
||||
# html: generates an HTML version of a manual.
|
||||
# eclipse: generates an HTML version of a manual that can be used as
|
||||
# eclipse help (including necessary metadata files).
|
||||
# tarball: creates a tarball for the doc files.
|
||||
# validate: validates
|
||||
# validate: validates
|
||||
# publish: pushes generated files to the Yocto Project website
|
||||
# clean: removes files
|
||||
#
|
||||
# The Makefile generates an HTML and PDF version of every document except the
|
||||
# Yocto Project Quick Start and the single, HTML mega-manual, which is comprised
|
||||
# of all the individual Yocto Project manuals. These two manuals are in HTML
|
||||
# form only. The variable DOC indicates the folder name for a given manual. The
|
||||
# variable VER represents the distro version of the Yocto Release for which the
|
||||
# manuals are being generated. The variable BRANCH is used to indicate the
|
||||
# branch (edison or denzil) and is used only when DOC=dev-manual or
|
||||
# DOC=mega-manual. If you do not specify a BRANCH, the default branch used
|
||||
# will be for the latest Yocto Project release. If you build for either
|
||||
# of all the individual Yocto Project manuals. These two manuals are in HTML
|
||||
# form only. The variable DOC indicates the folder name for a given manual. The
|
||||
# variable VER represents the distro version of the Yocto Release for which the
|
||||
# manuals are being generated. The variable BRANCH is used to indicate the
|
||||
# branch (edison or denzil) and is used only when DOC=dev-manual or
|
||||
# DOC=mega-manual. If you do not specify a BRANCH, the default branch used
|
||||
# will be for the latest Yocto Project release. If you build for either
|
||||
# edison or denzil, you must use BRANCH. You do not need to use BRANCH for
|
||||
# any release beyond denzil.
|
||||
# any release beyond denzil.
|
||||
#
|
||||
# To build a manual, you must invoke Makefile with the DOC argument. If you
|
||||
# are going to publish the manual, then you must invoke Makefile with both the
|
||||
# To build a manual, you must invoke Makefile with the DOC argument. If you
|
||||
# are going to publish the manual, then you must invoke Makefile with both the
|
||||
# DOC and the VER argument. Furthermore, if you are building or publishing
|
||||
# the edison or denzil versions of the Yocto Poject Development Manual or
|
||||
# the mega-manual, you must also use the BRANCH argument.
|
||||
# the edison or denzil versions of the Yocto Poject Development Manual or
|
||||
# the mega-manual, you must also use the BRANCH argument.
|
||||
#
|
||||
# Examples:
|
||||
#
|
||||
# make DOC=bsp-guide
|
||||
# make DOC=yocto-project-qs
|
||||
# make pdf DOC=ref-manual
|
||||
# make pdf DOC=poky-ref-manual
|
||||
# make DOC=dev-manual BRANCH=edison
|
||||
# make DOC=mega-manual BRANCH=denzil
|
||||
#
|
||||
# The first example generates the HTML and PDF versions of the BSP Guide.
|
||||
# The second example generates the HTML version only of the Quick Start. Note that
|
||||
# the Quick Start only has an HTML version available. The third example generates
|
||||
# both the PDF and HTML versions of the Yocto Project Reference Manual. The
|
||||
# fourth example generates both the PDF and HTML 'edison' versions of the YP
|
||||
# Development Manual. The last exmample generates the HTML version of the
|
||||
# mega-manual and uses the 'denzil' branch when choosing figures for the
|
||||
# tarball of figures. Any example that does not use the BRANCH argument
|
||||
# The second example generates the HTML version only of the Quick Start. Note that
|
||||
# the Quick Start only has an HTML version available. The third example generates
|
||||
# both the PDF and HTML versions of the Yocto Project Reference Manual. The
|
||||
# fourth example generates both the PDF and HTML 'edison' versions of the YP
|
||||
# Development Manual. The last example generates the HTML version of the
|
||||
# mega-manual and uses the 'denzil' branch when choosing figures for the
|
||||
# tarball of figures. Any example that does not use the BRANCH argument
|
||||
# builds the current version of the manual set.
|
||||
#
|
||||
# Use the publish target to push the generated manuals to the Yocto Project
|
||||
# website. All files needed for the manual's HTML form are pushed as well as the
|
||||
# PDF version (if applicable).
|
||||
# Use the publish target to push the generated manuals to the Yocto Project
|
||||
# website. All files needed for the manual's HTML form are pushed as well as the
|
||||
# PDF version (if applicable).
|
||||
# Examples:
|
||||
#
|
||||
# make publish DOC=bsp-guide VER=1.3
|
||||
# make publish DOC=adt-manual VER=1.3
|
||||
# make publish DOC=dev-manual VER=1.1.1 BRANCH=edison
|
||||
# make publish DOC=dev-manual VER=1.2 BRANCH=denzil
|
||||
# make publish DOC=dev-manual VER=1.2 BRANCH=denzil
|
||||
#
|
||||
# The first example publishes the 1.3 version of both the PDF and HTML versions of
|
||||
# the BSP Guide. The second example publishes the 1.3 version of both the PDF and
|
||||
# The first example publishes the 1.3 version of both the PDF and HTML versions of
|
||||
# the BSP Guide. The second example publishes the 1.3 version of both the PDF and
|
||||
# HTML versions of the ADT Manual. The third example publishes the PDF and HTML
|
||||
# 'edison' versions of the YP Development Manual. The fourth example publishes
|
||||
# the PDF and HTML 'denzil' versions of the YP Development Manual.
|
||||
#
|
||||
|
||||
ifeq ($(DOC),bsp-guide)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html pdf eclipse tarball
|
||||
TARFILES = bsp-style.css bsp-guide.html bsp-guide.pdf figures/bsp-title.png \
|
||||
eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf $(DOC)/eclipse
|
||||
XSLTOPTS = --stringparam html.stylesheet bsp-style.css \
|
||||
--stringparam chapter.autolabel 1 \
|
||||
--stringparam section.autolabel 1 \
|
||||
--stringparam section.label.includes.component.label 1 \
|
||||
--xinclude
|
||||
ALLPREQ = html pdf tarball
|
||||
TARFILES = bsp-style.css bsp-guide.html bsp-guide.pdf figures/bsp-title.png
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
|
||||
endif
|
||||
|
||||
ifeq ($(DOC),dev-manual)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html pdf eclipse tarball
|
||||
XSLTOPTS = --stringparam html.stylesheet dev-style.css \
|
||||
--stringparam chapter.autolabel 1 \
|
||||
--stringparam section.autolabel 1 \
|
||||
--stringparam section.label.includes.component.label 1 \
|
||||
--xinclude
|
||||
ALLPREQ = html pdf tarball
|
||||
#
|
||||
# Note that the tarfile might produce the "Cannot stat: No such file or directory" error
|
||||
# message for .PNG files that are not present when building a particular branch. The
|
||||
# message for .PNG files that are not present when building a particular branch. The
|
||||
# list of files is all-inclusive for all branches. Note, if you don't provide a BRANCH
|
||||
# option, it defaults to the latest stuff. This would be appropriate for "master" branch.
|
||||
#
|
||||
@@ -115,24 +120,21 @@ TARFILES = dev-style.css dev-manual.html dev-manual.pdf \
|
||||
figures/app-dev-flow.png figures/bsp-dev-flow.png figures/dev-title.png \
|
||||
figures/git-workflow.png figures/index-downloads.png figures/kernel-dev-flow.png \
|
||||
figures/kernel-overview-1.png figures/kernel-overview-2-generic.png \
|
||||
figures/source-repos.png figures/yp-download.png \
|
||||
eclipse
|
||||
figures/source-repos.png figures/yp-download.png
|
||||
endif
|
||||
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf $(DOC)/eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
|
||||
endif
|
||||
|
||||
ifeq ($(DOC),yocto-project-qs)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html eclipse tarball
|
||||
TARFILES = yocto-project-qs.html qs-style.css figures/yocto-environment.png \
|
||||
figures/building-an-image.png figures/using-a-pre-built-image.png \
|
||||
figures/yocto-project-transp.png \
|
||||
eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/eclipse
|
||||
XSLTOPTS = --stringparam html.stylesheet qs-style.css \
|
||||
--xinclude
|
||||
ALLPREQ = html tarball
|
||||
TARFILES = yocto-project-qs.html qs-style.css figures/yocto-environment.png figures/building-an-image.png figures/using-a-pre-built-image.png figures/yocto-project-transp.png
|
||||
MANUALS = $(DOC)/$(DOC).html
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
endif
|
||||
@@ -174,29 +176,13 @@ TARFILES = mega-manual.html mega-style.css figures/yocto-environment.png figures
|
||||
else
|
||||
TARFILES = mega-manual.html mega-style.css figures/yocto-environment.png figures/building-an-image.png \
|
||||
figures/using-a-pre-built-image.png \
|
||||
figures/poky-title.png figures/buildhistory.png figures/buildhistory-web.png \
|
||||
figures/poky-title.png \
|
||||
figures/adt-title.png figures/bsp-title.png \
|
||||
figures/kernel-dev-title.png figures/kernel-architecture-overview.png \
|
||||
figures/kernel-title.png figures/kernel-architecture-overview.png \
|
||||
figures/app-dev-flow.png figures/bsp-dev-flow.png figures/dev-title.png \
|
||||
figures/git-workflow.png figures/index-downloads.png figures/kernel-dev-flow.png \
|
||||
figures/kernel-overview-1.png figures/kernel-overview-2-generic.png \
|
||||
figures/source-repos.png figures/yp-download.png \
|
||||
figures/profile-title.png figures/kernelshark-all.png \
|
||||
figures/kernelshark-choose-events.png figures/kernelshark-i915-display.png \
|
||||
figures/kernelshark-output-display.png figures/lttngmain0.png \
|
||||
figures/oprofileui-busybox.png figures/oprofileui-copy-to-user.png \
|
||||
figures/oprofileui-downloading.png figures/oprofileui-processes.png \
|
||||
figures/perf-probe-do_fork-profile.png figures/perf-report-cycles-u.png \
|
||||
figures/perf-systemwide.png figures/perf-systemwide-libc.png \
|
||||
figures/perf-wget-busybox-annotate-menu.png figures/perf-wget-busybox-annotate-udhcpc.png \
|
||||
figures/perf-wget-busybox-debuginfo.png figures/perf-wget-busybox-dso-zoom.png \
|
||||
figures/perf-wget-busybox-dso-zoom-menu.png figures/perf-wget-busybox-expanded-stripped.png \
|
||||
figures/perf-wget-flat-stripped.png figures/perf-wget-g-copy-from-user-expanded-stripped.png \
|
||||
figures/perf-wget-g-copy-to-user-expanded-debuginfo.png figures/perf-wget-g-copy-to-user-expanded-stripped.png \
|
||||
figures/perf-wget-g-copy-to-user-expanded-stripped-unresolved-hidden.png figures/pybootchartgui-linux-yocto.png \
|
||||
figures/pychart-linux-yocto-rpm.png figures/pychart-linux-yocto-rpm-nostrip.png \
|
||||
figures/sched-wakeup-profile.png figures/sysprof-callers.png \
|
||||
figures/sysprof-copy-from-user.png figures/sysprof-copy-to-user.png
|
||||
figures/source-repos.png figures/yp-download.png
|
||||
endif
|
||||
|
||||
MANUALS = $(DOC)/$(DOC).html
|
||||
@@ -205,60 +191,45 @@ STYLESHEET = $(DOC)/*.css
|
||||
|
||||
endif
|
||||
|
||||
ifeq ($(DOC),ref-manual)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html pdf eclipse tarball
|
||||
TARFILES = ref-manual.html ref-style.css figures/poky-title.png \
|
||||
figures/buildhistory.png figures/buildhistory-web.png eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf $(DOC)/eclipse
|
||||
ifeq ($(DOC),poky-ref-manual)
|
||||
XSLTOPTS = --stringparam html.stylesheet ref-style.css \
|
||||
--stringparam chapter.autolabel 1 \
|
||||
--stringparam appendix.autolabel A \
|
||||
--stringparam section.autolabel 1 \
|
||||
--stringparam section.label.includes.component.label 1 \
|
||||
--xinclude
|
||||
ALLPREQ = html pdf tarball
|
||||
TARFILES = poky-ref-manual.html ref-style.css figures/poky-title.png
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
endif
|
||||
|
||||
|
||||
ifeq ($(DOC),adt-manual)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html pdf eclipse tarball
|
||||
TARFILES = adt-manual.html adt-manual.pdf adt-style.css figures/adt-title.png \
|
||||
eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf $(DOC)/eclipse
|
||||
XSLTOPTS = --stringparam html.stylesheet adt-style.css \
|
||||
--stringparam chapter.autolabel 1 \
|
||||
--stringparam appendix.autolabel A \
|
||||
--stringparam section.autolabel 1 \
|
||||
--stringparam section.label.includes.component.label 1 \
|
||||
--xinclude
|
||||
ALLPREQ = html pdf tarball
|
||||
TARFILES = adt-manual.html adt-manual.pdf adt-style.css figures/adt-title.png
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
endif
|
||||
|
||||
ifeq ($(DOC),profile-manual)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html pdf eclipse tarball
|
||||
TARFILES = profile-manual.html profile-manual.pdf profile-manual-style.css \
|
||||
figures/profile-title.png figures/kernelshark-all.png \
|
||||
figures/kernelshark-choose-events.png figures/kernelshark-i915-display.png \
|
||||
figures/kernelshark-output-display.png figures/lttngmain0.png \
|
||||
figures/oprofileui-busybox.png figures/oprofileui-copy-to-user.png \
|
||||
figures/oprofileui-downloading.png figures/oprofileui-processes.png \
|
||||
figures/perf-probe-do_fork-profile.png figures/perf-report-cycles-u.png \
|
||||
figures/perf-systemwide.png figures/perf-systemwide-libc.png \
|
||||
figures/perf-wget-busybox-annotate-menu.png figures/perf-wget-busybox-annotate-udhcpc.png \
|
||||
figures/perf-wget-busybox-debuginfo.png figures/perf-wget-busybox-dso-zoom.png \
|
||||
figures/perf-wget-busybox-dso-zoom-menu.png figures/perf-wget-busybox-expanded-stripped.png \
|
||||
figures/perf-wget-flat-stripped.png figures/perf-wget-g-copy-from-user-expanded-stripped.png \
|
||||
figures/perf-wget-g-copy-to-user-expanded-debuginfo.png figures/perf-wget-g-copy-to-user-expanded-stripped.png \
|
||||
figures/perf-wget-g-copy-to-user-expanded-stripped-unresolved-hidden.png figures/pybootchartgui-linux-yocto.png \
|
||||
figures/pychart-linux-yocto-rpm.png figures/pychart-linux-yocto-rpm-nostrip.png \
|
||||
figures/sched-wakeup-profile.png figures/sysprof-callers.png \
|
||||
figures/sysprof-copy-from-user.png figures/sysprof-copy-to-user.png \
|
||||
eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf $(DOC)/eclipse
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
endif
|
||||
|
||||
ifeq ($(DOC),kernel-dev)
|
||||
XSLTOPTS = --xinclude
|
||||
ALLPREQ = html pdf eclipse tarball
|
||||
TARFILES = kernel-dev.html kernel-dev.pdf kernel-dev-style.css figures/kernel-dev-title.png \
|
||||
figures/kernel-architecture-overview.png \
|
||||
eclipse
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf $(DOC)/eclipse
|
||||
ifeq ($(DOC),kernel-manual)
|
||||
XSLTOPTS = --stringparam html.stylesheet kernel-style.css \
|
||||
--stringparam chapter.autolabel 1 \
|
||||
--stringparam appendix.autolabel A \
|
||||
--stringparam section.autolabel 1 \
|
||||
--stringparam section.label.includes.component.label 1 \
|
||||
--xinclude
|
||||
ALLPREQ = html pdf tarball
|
||||
TARFILES = kernel-manual.html kernel-manual.pdf kernel-style.css figures/kernel-title.png figures/kernel-architecture-overview.png
|
||||
MANUALS = $(DOC)/$(DOC).html $(DOC)/$(DOC).pdf
|
||||
FIGURES = figures
|
||||
STYLESHEET = $(DOC)/*.css
|
||||
endif
|
||||
@@ -287,10 +258,10 @@ else
|
||||
|
||||
cd $(DOC); ../tools/poky-docbook-to-pdf $(DOC).xml ../template; cd ..
|
||||
endif
|
||||
|
||||
|
||||
html:
|
||||
ifeq ($(DOC),mega-manual)
|
||||
# See http://www.sagehill.net/docbookxsl/HtmlOutput.html
|
||||
# See http://www.sagehill.net/docbookxsl/HtmlOutput.html
|
||||
@echo " "
|
||||
@echo "******** Building "$(DOC)
|
||||
@echo " "
|
||||
@@ -304,7 +275,7 @@ ifeq ($(DOC),mega-manual)
|
||||
@echo " "
|
||||
cd $(DOC); rm mega-manual.html; mv mega-output.html mega-manual.html; cd ..
|
||||
else
|
||||
# See http://www.sagehill.net/docbookxsl/HtmlOutput.html
|
||||
# See http://www.sagehill.net/docbookxsl/HtmlOutput.html
|
||||
@echo " "
|
||||
@echo "******** Building "$(DOC)
|
||||
@echo " "
|
||||
@@ -312,49 +283,6 @@ else
|
||||
endif
|
||||
|
||||
|
||||
eclipse: BASE_DIR = html/$(DOC)/
|
||||
|
||||
eclipse: eclipse-generate eclipse-resolve-links
|
||||
|
||||
.PHONY : eclipse-generate eclipse-resolve-links
|
||||
|
||||
eclipse-generate:
|
||||
ifeq ($(filter $(DOC), adt-manual bsp-guide dev-manual kernel-dev profile-manual ref-manual yocto-project-qs),)
|
||||
@echo " "
|
||||
@echo "ERROR: You can only create eclipse documentation"
|
||||
@echo " of the following documentation parts:"
|
||||
@echo " - adt-manual"
|
||||
@echo " - bsp-guide"
|
||||
@echo " - dev-manual"
|
||||
@echo " - kernel-dev"
|
||||
@echo " - profile-manual"
|
||||
@echo " - ref-manual"
|
||||
@echo " - yocto-project-qs"
|
||||
@echo " "
|
||||
else
|
||||
@echo " "
|
||||
@echo "******** Building eclipse help of "$(DOC)
|
||||
@echo " "
|
||||
cd $(DOC) && \
|
||||
xsltproc $(XSLTOPTS) \
|
||||
--stringparam base.dir '$(BASE_DIR)' \
|
||||
-o eclipse/$(DOC).html \
|
||||
$(DOC)-eclipse-customization.xsl $(DOC).xml && \
|
||||
mv eclipse/toc.xml eclipse/$(DOC)-toc.xml && \
|
||||
cp -rf $(FIGURES) eclipse/$(BASE_DIR) && \
|
||||
cd ..;
|
||||
|
||||
$(call modify-eclipse)
|
||||
endif
|
||||
|
||||
eclipse-resolve-links:
|
||||
@echo " "
|
||||
@echo "******** Using eclipse-help.sed to process external links"
|
||||
@echo " "
|
||||
$(foreach FILE, \
|
||||
$(wildcard $(DOC)/eclipse/html/$(DOC)/*.html), \
|
||||
$(shell sed -i -f tools/eclipse-help.sed $(FILE)))
|
||||
|
||||
tarball: html
|
||||
@echo " "
|
||||
@echo "******** Creating Tarball of document files"
|
||||
@@ -380,4 +308,4 @@ publish:
|
||||
fi
|
||||
|
||||
clean:
|
||||
rm -rf $(MANUALS); rm $(DOC)/$(DOC).tgz;
|
||||
rm -f $(MANUALS)
|
||||
|
||||
@@ -6,22 +6,22 @@
|
||||
<title>Using the Command Line</title>
|
||||
|
||||
<para>
|
||||
Recall that earlier the manual discussed how to use an existing toolchain
|
||||
tarball that had been installed into the default installation
|
||||
directory, <filename>/opt/poky</filename>, which is outside of the
|
||||
Recall that earlier the manual discussed how to use an existing toolchain
|
||||
tarball that had been installed into <filename>/opt/poky</filename>,
|
||||
which is outside of the
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>
|
||||
(see the section "<link linkend='using-an-existing-toolchain-tarball'>Using a Cross-Toolchain Tarball)</link>".
|
||||
And, that sourcing your architecture-specific environment setup script
|
||||
initializes a suitable cross-toolchain development environment.
|
||||
During the setup, locations for the compiler, QEMU scripts, QEMU binary,
|
||||
a special version of <filename>pkgconfig</filename> and other useful
|
||||
(see the section "<link linkend='using-an-existing-toolchain-tarball'>Using a Cross-Toolchain Tarball)</link>".
|
||||
And, that sourcing your architecture-specific environment setup script
|
||||
initializes a suitable cross-toolchain development environment.
|
||||
During the setup, locations for the compiler, QEMU scripts, QEMU binary,
|
||||
a special version of <filename>pkgconfig</filename> and other useful
|
||||
utilities are added to the <filename>PATH</filename> variable.
|
||||
Variables to assist <filename>pkgconfig</filename> and <filename>autotools</filename>
|
||||
are also defined so that,
|
||||
for example, <filename>configure.sh</filename> can find pre-generated
|
||||
test results for tests that need target hardware on which to run.
|
||||
These conditions allow you to easily use the toolchain outside of the
|
||||
OpenEmbedded build environment on both autotools-based projects and
|
||||
Variables to assist <filename>pkgconfig</filename> and <filename>autotools</filename>
|
||||
are also defined so that,
|
||||
for example, <filename>configure.sh</filename> can find pre-generated
|
||||
test results for tests that need target hardware on which to run.
|
||||
These conditions allow you to easily use the toolchain outside of the
|
||||
OpenEmbedded build environment on both autotools-based projects and
|
||||
Makefile-based projects.
|
||||
</para>
|
||||
|
||||
@@ -29,176 +29,47 @@
|
||||
<title>Autotools-Based Projects</title>
|
||||
|
||||
<para>
|
||||
Once you have a suitable cross-toolchain installed, it is very easy to
|
||||
develop a project outside of the OpenEmbedded build system.
|
||||
This section presents a simple "Helloworld" example that shows how
|
||||
to set up, compile, and run the project.
|
||||
</para>
|
||||
|
||||
<section id='creating-and-running-a-project-based-on-gnu-autotools'>
|
||||
<title>Creating and Running a Project Based on GNU Autotools</title>
|
||||
|
||||
<para>
|
||||
Follow these steps to create a simple autotools-based project:
|
||||
<orderedlist>
|
||||
<listitem><para><emphasis>Create your directory:</emphasis>
|
||||
Create a clean directory for your project and then make
|
||||
that directory your working location:
|
||||
<literallayout class='monospaced'>
|
||||
$ mkdir $HOME/helloworld
|
||||
$ cd $HOME/helloworld
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Populate the directory:</emphasis>
|
||||
Create <filename>hello.c</filename>, <filename>Makefile.am</filename>,
|
||||
and <filename>configure.in</filename> files as follows:
|
||||
<itemizedlist>
|
||||
<listitem><para>For <filename>hello.c</filename>, include
|
||||
these lines:
|
||||
<literallayout class='monospaced'>
|
||||
#include <stdio.h>
|
||||
|
||||
main()
|
||||
{
|
||||
printf("Hello World!\n");
|
||||
}
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para>For <filename>Makefile.am</filename>,
|
||||
include these lines:
|
||||
<literallayout class='monospaced'>
|
||||
bin_PROGRAMS = hello
|
||||
hello_SOURCES = hello.c
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para>For <filename>configure.in</filename>,
|
||||
include these lines:
|
||||
<literallayout class='monospaced'>
|
||||
AC_INIT(hello.c)
|
||||
AM_INIT_AUTOMAKE(hello,0.1)
|
||||
AC_PROG_CC
|
||||
AC_PROG_INSTALL
|
||||
AC_OUTPUT(Makefile)
|
||||
</literallayout></para></listitem>
|
||||
</itemizedlist></para></listitem>
|
||||
<listitem><para><emphasis>Source the cross-toolchain
|
||||
environment setup file:</emphasis>
|
||||
Installation of the cross-toolchain creates a cross-toolchain
|
||||
environment setup script in the directory in which the ADT
|
||||
was installed.
|
||||
Before you can use the tools to develop your project, you must
|
||||
source this setup script.
|
||||
The script begins with the string "environment-setup" and contains
|
||||
the machine architecture, which is followed by the string
|
||||
"poky-linux".
|
||||
Here is an example that sources a script from the
|
||||
default ADT installation directory that uses the
|
||||
32-bit Intel x86 Architecture and the
|
||||
&DISTRO_NAME; Yocto Project release:
|
||||
<literallayout class='monospaced'>
|
||||
$ source /opt/poky/&DISTRO;/environment-setup-i586-poky-linux
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Generate the local aclocal.m4
|
||||
files and create the configure script:</emphasis>
|
||||
The following GNU Autotools generate the local
|
||||
<filename>aclocal.m4</filename> files and create the
|
||||
configure script:
|
||||
<literallayout class='monospaced'>
|
||||
$ aclocal
|
||||
$ autoconf
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Generate files needed by GNU
|
||||
coding standards:</emphasis>
|
||||
GNU coding standards require certain files in order for the
|
||||
project to be compliant.
|
||||
This command creates those files:
|
||||
<literallayout class='monospaced'>
|
||||
$ touch NEWS README AUTHORS ChangeLog
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Generate the configure
|
||||
file:</emphasis>
|
||||
This command generates the <filename>configure</filename> script:
|
||||
<literallayout class='monospaced'>
|
||||
$ automake -a
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Cross-compile the project:</emphasis>
|
||||
This command compiles the project using the cross-compiler:
|
||||
<literallayout class='monospaced'>
|
||||
$ ./configure ${CONFIGURE_FLAGS}
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Make and install the project:</emphasis>
|
||||
These two commands generate and install the project into the
|
||||
destination directory:
|
||||
<literallayout class='monospaced'>
|
||||
$ make
|
||||
$ make install DESTDIR=./tmp
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Verify the installation:</emphasis>
|
||||
This command is a simple way to verify the installation
|
||||
of your project.
|
||||
Running the command prints the architecture on which
|
||||
the binary file can run.
|
||||
This architecture should be the same architecture that
|
||||
the installed cross-toolchain supports.
|
||||
<literallayout class='monospaced'>
|
||||
$ file ./tmp/usr/local/bin/hello
|
||||
</literallayout></para></listitem>
|
||||
<listitem><para><emphasis>Execute your project:</emphasis>
|
||||
To execute the project in the shell, simply enter the name.
|
||||
You could also copy the binary to the actual target hardware
|
||||
and run the project there as well:
|
||||
<literallayout class='monospaced'>
|
||||
$ ./hello
|
||||
</literallayout>
|
||||
As expected, the project displays the "Hello World!" message.
|
||||
</para></listitem>
|
||||
</orderedlist>
|
||||
</para>
|
||||
</section>
|
||||
|
||||
<section id='passing-host-options'>
|
||||
<title>Passing Host Options</title>
|
||||
|
||||
<para>
|
||||
For an Autotools-based project, you can use the cross-toolchain by just
|
||||
passing the appropriate host option to <filename>configure.sh</filename>.
|
||||
The host option you use is derived from the name of the environment setup
|
||||
script found in the directory in which you installed the cross-toolchain.
|
||||
For example, the host option for an ARM-based target that uses the GNU EABI
|
||||
is <filename>armv5te-poky-linux-gnueabi</filename>.
|
||||
You will notice that the name of the script is
|
||||
<filename>environment-setup-armv5te-poky-linux-gnueabi</filename>.
|
||||
Thus, the following command works:
|
||||
<literallayout class='monospaced'>
|
||||
$ ./configure --host=armv5te-poky-linux-gnueabi \
|
||||
For an Autotools-based project, you can use the cross-toolchain by just
|
||||
passing the appropriate host option to <filename>configure.sh</filename>.
|
||||
The host option you use is derived from the name of the environment setup
|
||||
script in <filename>/opt/poky</filename> resulting from installation of the
|
||||
cross-toolchain tarball.
|
||||
For example, the host option for an ARM-based target that uses the GNU EABI
|
||||
is <filename>armv5te-poky-linux-gnueabi</filename>.
|
||||
Note that the name of the script is
|
||||
<filename>environment-setup-armv5te-poky-linux-gnueabi</filename>.
|
||||
Thus, the following command works:
|
||||
<literallayout class='monospaced'>
|
||||
$ configure --host=armv5te-poky-linux-gnueabi \
|
||||
--with-libtool-sysroot=<sysroot-dir>
|
||||
</literallayout>
|
||||
</para>
|
||||
|
||||
<para>
|
||||
This single command updates your project and rebuilds it using the appropriate
|
||||
cross-toolchain tools.
|
||||
<note>
|
||||
If the <filename>configure</filename> script results in problems recognizing the
|
||||
<filename>--with-libtool-sysroot=<sysroot-dir></filename> option,
|
||||
regenerate the script to enable the support by doing the following and then
|
||||
re-running the script:
|
||||
<literallayout class='monospaced'>
|
||||
</literallayout>
|
||||
</para>
|
||||
<para>
|
||||
This single command updates your project and rebuilds it using the appropriate
|
||||
cross-toolchain tools.
|
||||
</para>
|
||||
<note>
|
||||
If the <filename>configure</filename> script results in problems recognizing the
|
||||
<filename>--with-libtool-sysroot=<sysroot-dir></filename> option,
|
||||
regenerate the script to enable the support by doing the following and then
|
||||
re-running the script:
|
||||
<literallayout class='monospaced'>
|
||||
$ libtoolize --automake
|
||||
$ aclocal -I ${OECORE_NATIVE_SYSROOT}/usr/share/aclocal \
|
||||
[-I <dir_containing_your_project-specific_m4_macros>]
|
||||
$ autoconf
|
||||
$ autoheader
|
||||
$ automake -a
|
||||
</literallayout>
|
||||
</note>
|
||||
</para>
|
||||
</section>
|
||||
</literallayout>
|
||||
</note>
|
||||
</section>
|
||||
|
||||
<section id='makefile-based-projects'>
|
||||
<title>Makefile-Based Projects</title>
|
||||
|
||||
<para>
|
||||
For a Makefile-based project, you use the cross-toolchain by making sure
|
||||
the tools are used.
|
||||
For a Makefile-based project, you use the cross-toolchain by making sure
|
||||
the tools are used.
|
||||
You can do this as follows:
|
||||
<literallayout class='monospaced'>
|
||||
CC=arm-poky-linux-gnueabi-gcc
|
||||
|
||||
@@ -6,70 +6,59 @@
|
||||
<title>Introduction</title>
|
||||
|
||||
<para>
|
||||
Welcome to the Yocto Project Application Developer's Guide.
|
||||
This manual provides information that lets you begin developing applications
|
||||
Welcome to the Yocto Project Application Developer's Guide.
|
||||
This manual provides information that lets you begin developing applications
|
||||
using the Yocto Project.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
The Yocto Project provides an application development environment based on
|
||||
The Yocto Project provides an application development environment based on
|
||||
an Application Development Toolkit (ADT) and the availability of stand-alone
|
||||
cross-development toolchains and other tools.
|
||||
This manual describes the ADT and how you can configure and install it,
|
||||
how to access and use the cross-development toolchains, how to
|
||||
how to access and use the cross-development toolchains, how to
|
||||
customize the development packages installation,
|
||||
how to use command line development for both Autotools-based and Makefile-based projects,
|
||||
and an introduction to the <trademark class='trade'>Eclipse</trademark> IDE
|
||||
Yocto Plug-in.
|
||||
<note>
|
||||
The ADT is distribution-neutral and does not require the Yocto
|
||||
Project reference distribution, which is called Poky.
|
||||
This manual, however, uses examples that use the Poky distribution.
|
||||
</note>
|
||||
how to use command line development for both Autotools-based and Makefile-based projects,
|
||||
and an introduction to the Eclipse Yocto Plug-in.
|
||||
</para>
|
||||
|
||||
<section id='adt-intro-section'>
|
||||
<title>The Application Development Toolkit (ADT)</title>
|
||||
|
||||
<para>
|
||||
Part of the Yocto Project development solution is an Application Development
|
||||
Part of the Yocto Project development solution is an Application Development
|
||||
Toolkit (ADT).
|
||||
The ADT provides you with a custom-built, cross-development
|
||||
The ADT provides you with a custom-built, cross-development
|
||||
platform suited for developing a user-targeted product application.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
Fundamentally, the ADT consists of the following:
|
||||
<itemizedlist>
|
||||
<listitem><para>An architecture-specific cross-toolchain and matching
|
||||
sysroot both built by the OpenEmbedded build system.
|
||||
The toolchain and sysroot are based on a
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#metadata'>Metadata</ulink>
|
||||
configuration and extensions,
|
||||
<listitem><para>An architecture-specific cross-toolchain and matching
|
||||
sysroot both built by the OpenEmbedded build system, which uses Poky.
|
||||
The toolchain and sysroot are based on a metadata configuration and extensions,
|
||||
which allows you to cross-develop on the host machine for the target hardware.
|
||||
</para></listitem>
|
||||
<listitem><para>The Eclipse IDE Yocto Plug-in.</para></listitem>
|
||||
<listitem><para>The Quick EMUlator (QEMU), which lets you simulate target hardware.
|
||||
</para></listitem>
|
||||
<listitem><para>Various user-space tools that greatly enhance your application
|
||||
<listitem><para>Various user-space tools that greatly enhance your application
|
||||
development experience.</para></listitem>
|
||||
</itemizedlist>
|
||||
</para>
|
||||
|
||||
<section id='the-cross-development-toolchain'>
|
||||
<title>The Cross-Development Toolchain</title>
|
||||
<section id='the-cross-toolchain'>
|
||||
<title>The Cross-Toolchain</title>
|
||||
|
||||
<para>
|
||||
The
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#cross-development-toolchain'>Cross-Development Toolchain</ulink>
|
||||
consists of a cross-compiler, cross-linker, and cross-debugger
|
||||
that are used to develop user-space applications for targeted
|
||||
hardware.
|
||||
This toolchain is created either by running the ADT Installer
|
||||
script, a toolchain installer script, or through a
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>
|
||||
that is based on your Metadata configuration or extension for
|
||||
your targeted device.
|
||||
The cross-toolchain consists of a cross-compiler, cross-linker, and cross-debugger
|
||||
that are used to develop user-space applications for targeted hardware.
|
||||
This toolchain is created either by running the ADT Installer script, a toolchain installer
|
||||
script, or through a
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink> that
|
||||
is based on your metadata
|
||||
configuration or extension for your targeted device.
|
||||
The cross-toolchain works with a matching target sysroot.
|
||||
</para>
|
||||
</section>
|
||||
@@ -78,10 +67,10 @@
|
||||
<title>Sysroot</title>
|
||||
|
||||
<para>
|
||||
The matching target sysroot contains needed headers and libraries for generating
|
||||
binaries that run on the target architecture.
|
||||
The sysroot is based on the target root filesystem image that is built by
|
||||
the OpenEmbedded build system and uses the same Metadata configuration
|
||||
The matching target sysroot contains needed headers and libraries for generating
|
||||
binaries that run on the target architecture.
|
||||
The sysroot is based on the target root filesystem image that is built by
|
||||
the OpenEmbedded build system Poky and uses the same metadata configuration
|
||||
used to build the cross-toolchain.
|
||||
</para>
|
||||
</section>
|
||||
@@ -90,24 +79,24 @@
|
||||
<title>Eclipse Yocto Plug-in</title>
|
||||
|
||||
<para>
|
||||
The Eclipse IDE is a popular development environment and it fully supports
|
||||
development using the Yocto Project.
|
||||
When you install and configure the Eclipse Yocto Project Plug-in into
|
||||
the Eclipse IDE, you maximize your Yocto Project experience.
|
||||
Installing and configuring the Plug-in results in an environment that
|
||||
has extensions specifically designed to let you more easily develop software.
|
||||
These extensions allow for cross-compilation, deployment, and execution of
|
||||
your output into a QEMU emulation session.
|
||||
You can also perform cross-debugging and profiling.
|
||||
The environment also supports a suite of tools that allows you to perform
|
||||
remote profiling, tracing, collection of power data, collection of
|
||||
The Eclipse IDE is a popular development environment and it fully supports
|
||||
development using the Yocto Project.
|
||||
When you install and configure the Eclipse Yocto Project Plug-in into
|
||||
the Eclipse IDE, you maximize your Yocto Project experience.
|
||||
Installing and configuring the Plug-in results in an environment that
|
||||
has extensions specifically designed to let you more easily develop software.
|
||||
These extensions allow for cross-compilation, deployment, and execution of
|
||||
your output into a QEMU emulation session.
|
||||
You can also perform cross-debugging and profiling.
|
||||
The environment also supports a suite of tools that allows you to perform
|
||||
remote profiling, tracing, collection of power data, collection of
|
||||
latency data, and collection of performance data.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
For information about the application development workflow that uses the Eclipse
|
||||
IDE and for a detailed example of how to install and configure the Eclipse
|
||||
Yocto Project Plug-in, see the
|
||||
Yocto Project Plug-in, see the
|
||||
"<ulink url='&YOCTO_DOCS_DEV_URL;#adt-eclipse'>Working Within Eclipse</ulink>" section
|
||||
of the Yocto Project Development Manual.
|
||||
</para>
|
||||
@@ -117,20 +106,20 @@
|
||||
<title>The QEMU Emulator</title>
|
||||
|
||||
<para>
|
||||
The QEMU emulator allows you to simulate your hardware while running your
|
||||
The QEMU emulator allows you to simulate your hardware while running your
|
||||
application or image.
|
||||
QEMU is made available in a number of ways:
|
||||
<itemizedlist>
|
||||
<listitem><para>If you use the ADT Installer script to install ADT, you can
|
||||
<listitem><para>If you use the ADT Installer script to install ADT, you can
|
||||
specify whether or not to install QEMU.</para></listitem>
|
||||
<listitem><para>If you have downloaded a Yocto Project release and unpacked
|
||||
it to create a
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#source-directory'>Source Directory</ulink> and
|
||||
you have sourced
|
||||
the environment setup script, QEMU is installed and automatically
|
||||
<listitem><para>If you have downloaded a Yocto Project release and unpacked
|
||||
it to create a
|
||||
<ulink url='&YOCTO_DOCS_DEV_URL;#source-directory'>Source Directory</ulink> and
|
||||
you have sourced
|
||||
the environment setup script, QEMU is installed and automatically
|
||||
available.</para></listitem>
|
||||
<listitem><para>If you have installed the cross-toolchain
|
||||
tarball and you have sourced the toolchain's setup environment script, QEMU
|
||||
<listitem><para>If you have installed the cross-toolchain
|
||||
tarball and you have sourced the toolchain's setup environment script, QEMU
|
||||
is also installed and automatically available.</para></listitem>
|
||||
</itemizedlist>
|
||||
</para>
|
||||
@@ -140,49 +129,38 @@
|
||||
<title>User-Space Tools</title>
|
||||
|
||||
<para>
|
||||
User-space tools are included as part of the distribution.
|
||||
You will find these tools helpful during development.
|
||||
The tools include LatencyTOP, PowerTOP, OProfile, Perf, SystemTap, and Lttng-ust.
|
||||
User-space tools are included as part of the distribution.
|
||||
You will find these tools helpful during development.
|
||||
The tools include LatencyTOP, PowerTOP, OProfile, Perf, SystemTap, and Lttng-ust.
|
||||
These tools are common development tools for the Linux platform.
|
||||
<itemizedlist>
|
||||
<listitem><para><emphasis>LatencyTOP:</emphasis> LatencyTOP focuses on latency
|
||||
<listitem><para><emphasis>LatencyTOP:</emphasis> LatencyTOP focuses on latency
|
||||
that causes skips in audio,
|
||||
stutters in your desktop experience, or situations that overload your server
|
||||
even when you have plenty of CPU power left.
|
||||
You can find out more about LatencyTOP at
|
||||
<ulink url='https://latencytop.org/'></ulink>.</para></listitem>
|
||||
<listitem><para><emphasis>PowerTOP:</emphasis> Helps you determine what
|
||||
software is using the most power.
|
||||
You can find out more about PowerTOP at
|
||||
stutters in your desktop experience, or situations that overload your server
|
||||
even when you have plenty of CPU power left.
|
||||
You can find out more about LatencyTOP at
|
||||
<ulink url='http://www.latencytop.org/'></ulink>.</para></listitem>
|
||||
<listitem><para><emphasis>PowerTOP:</emphasis> Helps you determine what
|
||||
software is using the most power.
|
||||
You can find out more about PowerTOP at
|
||||
<ulink url='https://01.org/powertop/'></ulink>.</para></listitem>
|
||||
<listitem><para><emphasis>OProfile:</emphasis> A system-wide profiler for Linux
|
||||
systems that is capable of profiling all running code at low overhead.
|
||||
You can find out more about OProfile at
|
||||
<ulink url='http://oprofile.sourceforge.net/about/'></ulink>.
|
||||
For examples on how to setup and use this tool, see the
|
||||
"<ulink url='&YOCTO_DOCS_PROF_URL;#profile-manual-oprofile'>OProfile</ulink>"
|
||||
section in the Yocto Project Profiling and Tracing Manual.
|
||||
</para></listitem>
|
||||
<listitem><para><emphasis>Perf:</emphasis> Performance counters for Linux used
|
||||
to keep track of certain types of hardware and software events.
|
||||
For more information on these types of counters see
|
||||
<ulink url='https://perf.wiki.kernel.org/'></ulink> and click
|
||||
on “Perf tools.”
|
||||
For examples on how to setup and use this tool, see the
|
||||
"<ulink url='&YOCTO_DOCS_PROF_URL;#profile-manual-perf'>perf</ulink>"
|
||||
section in the Yocto Project Profiling and Tracing Manual.
|
||||
</para></listitem>
|
||||
<listitem><para><emphasis>SystemTap:</emphasis> A free software infrastructure
|
||||
that simplifies information gathering about a running Linux system.
|
||||
This information helps you diagnose performance or functional problems.
|
||||
SystemTap is not available as a user-space tool through the Eclipse IDE Yocto Plug-in.
|
||||
See <ulink url='http://sourceware.org/systemtap'></ulink> for more information
|
||||
on SystemTap.
|
||||
For examples on how to setup and use this tool, see the
|
||||
"<ulink url='&YOCTO_DOCS_PROF_URL;#profile-manual-systemtap'>SystemTap</ulink>"
|
||||
section in the Yocto Project Profiling and Tracing Manual.</para></listitem>
|
||||
<listitem><para><emphasis>Lttng-ust:</emphasis> A User-space Tracer designed to
|
||||
provide detailed information on user-space activity.
|
||||
<listitem><para><emphasis>OProfile:</emphasis> A system-wide profiler for Linux
|
||||
systems that is capable of profiling all running code at low overhead.
|
||||
You can find out more about OProfile at
|
||||
<ulink url='http://oprofile.sourceforge.net/about/'></ulink>.</para></listitem>
|
||||
<listitem><para><emphasis>Perf:</emphasis> Performance counters for Linux used
|
||||
to keep track of certain types of hardware and software events.
|
||||
For more information on these types of counters see
|
||||
<ulink url='https://perf.wiki.kernel.org/'></ulink> and click
|
||||
on “Perf tools.”</para></listitem>
|
||||
<listitem><para><emphasis>SystemTap:</emphasis> A free software infrastructure
|
||||
that simplifies information gathering about a running Linux system.
|
||||
This information helps you diagnose performance or functional problems.
|
||||
SystemTap is not available as a user-space tool through the Eclipse IDE Yocto Plug-in.
|
||||
See <ulink url='http://sourceware.org/systemtap'></ulink> for more information
|
||||
on SystemTap.</para></listitem>
|
||||
<listitem><para><emphasis>Lttng-ust:</emphasis> A User-space Tracer designed to
|
||||
provide detailed information on user-space activity.
|
||||
See <ulink url='http://lttng.org/ust'></ulink> for more information on Lttng-ust.
|
||||
</para></listitem>
|
||||
</itemizedlist>
|
||||
|
||||
@@ -3,9 +3,6 @@
|
||||
|
||||
<xsl:import href="http://docbook.sourceforge.net/release/xsl/current/xhtml/docbook.xsl" />
|
||||
|
||||
<xsl:param name="html.stylesheet" select="'adt-style.css'" />
|
||||
<xsl:param name="chapter.autolabel" select="1" />
|
||||
<xsl:param name="appendix.autolabel" select="1" />
|
||||
<xsl:param name="section.autolabel" select="1" />
|
||||
<xsl:param name="section.label.includes.component.label" select="1" />
|
||||
<!-- <xsl:param name="generate.toc" select="'article nop'"></xsl:param> -->
|
||||
|
||||
</xsl:stylesheet>
|
||||
|
||||
@@ -1,27 +0,0 @@
|
||||
<?xml version='1.0'?>
|
||||
<xsl:stylesheet
|
||||
xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
|
||||
xmlns="http://www.w3.org/1999/xhtml"
|
||||
xmlns:fo="http://www.w3.org/1999/XSL/Format"
|
||||
version="1.0">
|
||||
|
||||
<xsl:import
|
||||
href="http://docbook.sourceforge.net/release/xsl/current/eclipse/eclipse3.xsl" />
|
||||
|
||||
<xsl:param name="chunker.output.indent" select="'yes'"/>
|
||||
<xsl:param name="chunk.quietly" select="1"/>
|
||||
<xsl:param name="chunk.first.sections" select="1"/>
|
||||
<xsl:param name="chunk.section.depth" select="10"/>
|
||||
<xsl:param name="use.id.as.filename" select="1"/>
|
||||
<xsl:param name="ulink.target" select="'_self'" />
|
||||
<xsl:param name="base.dir" select="'html/adt-manual/'"/>
|
||||
<xsl:param name="html.stylesheet" select="'../book.css'"/>
|
||||
<xsl:param name="eclipse.manifest" select="0"/>
|
||||
<xsl:param name="create.plugin.xml" select="0"/>
|
||||
<xsl:param name="suppress.navigation" select="1"/>
|
||||
<xsl:param name="generate.index" select="0"/>
|
||||
<xsl:param name="chapter.autolabel" select="1" />
|
||||
<xsl:param name="appendix.autolabel" select="1" />
|
||||
<xsl:param name="section.autolabel" select="1" />
|
||||
<xsl:param name="section.label.includes.component.label" select="1" />
|
||||
</xsl:stylesheet>
|
||||
@@ -2,7 +2,7 @@
|
||||
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
|
||||
[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >
|
||||
|
||||
<book id='adt-manual' lang='en'
|
||||
<book id='adt-manual' lang='en'
|
||||
xmlns:xi="http://www.w3.org/2003/XInclude"
|
||||
xmlns="http://docbook.org/ns/docbook"
|
||||
>
|
||||
@@ -10,15 +10,13 @@
|
||||
|
||||
<mediaobject>
|
||||
<imageobject>
|
||||
<imagedata fileref='figures/adt-title.png'
|
||||
format='SVG'
|
||||
<imagedata fileref='figures/adt-title.png'
|
||||
format='SVG'
|
||||
align='left' scalefit='1' width='100%'/>
|
||||
</imageobject>
|
||||
</imageobject>
|
||||
</mediaobject>
|
||||
|
||||
<title>
|
||||
Yocto Project Application Developer's Guide
|
||||
</title>
|
||||
<title></title>
|
||||
|
||||
<authorgroup>
|
||||
<author>
|
||||
@@ -56,37 +54,7 @@
|
||||
<date>October 2012</date>
|
||||
<revremark>Released with the Yocto Project 1.3 Release.</revremark>
|
||||
</revision>
|
||||
<revision>
|
||||
<revnumber>1.4</revnumber>
|
||||
<date>April 2013</date>
|
||||
<revremark>Released with the Yocto Project 1.4 Release.</revremark>
|
||||
</revision>
|
||||
<revision>
|
||||
<revnumber>1.4.1</revnumber>
|
||||
<date>June 2013</date>
|
||||
<revremark>Released with the Yocto Project 1.4.1 Release.</revremark>
|
||||
</revision>
|
||||
<revision>
|
||||
<revnumber>1.4.2</revnumber>
|
||||
<date>August 2013</date>
|
||||
<revremark>Released with the Yocto Project 1.4.2 Release.</revremark>
|
||||
</revision>
|
||||
<revision>
|
||||
<revnumber>1.4.3</revnumber>
|
||||
<date>March 2014</date>
|
||||
<revremark>Released with the Yocto Project 1.4.3 Release.</revremark>
|
||||
</revision>
|
||||
<revision>
|
||||
<revnumber>1.4.4</revnumber>
|
||||
<date>May 2014</date>
|
||||
<revremark>Released with the Yocto Project 1.4.4 Release.</revremark>
|
||||
</revision>
|
||||
<revision>
|
||||
<revnumber>1.4.5</revnumber>
|
||||
<date>July 2014</date>
|
||||
<revremark>Released with the Yocto Project 1.4.5 Release.</revremark>
|
||||
</revision>
|
||||
</revhistory>
|
||||
</revhistory>
|
||||
|
||||
<copyright>
|
||||
<year>&COPYRIGHT_YEAR;</year>
|
||||
@@ -95,12 +63,12 @@
|
||||
|
||||
<legalnotice>
|
||||
<para>
|
||||
Permission is granted to copy, distribute and/or modify this document under
|
||||
Permission is granted to copy, distribute and/or modify this document under
|
||||
the terms of the <ulink type="http" url="http://creativecommons.org/licenses/by-sa/2.0/uk/">Creative Commons Attribution-Share Alike 2.0 UK: England & Wales</ulink> as published by Creative Commons.
|
||||
</para>
|
||||
<note>
|
||||
Due to production processes, there could be differences between the Yocto Project
|
||||
documentation bundled in the release tarball and the
|
||||
documentation bundled in the release tarball and the
|
||||
<ulink url='&YOCTO_DOCS_ADT_URL;'>Yocto Project Application Developer's Guide</ulink> on
|
||||
the <ulink url='&YOCTO_HOME_URL;'>Yocto Project</ulink> website.
|
||||
For the latest version of this manual, see the manual on the website.
|
||||
@@ -124,6 +92,6 @@
|
||||
-->
|
||||
|
||||
</book>
|
||||
<!--
|
||||
vim: expandtab tw=80 ts=4
|
||||
<!--
|
||||
vim: expandtab tw=80 ts=4
|
||||
-->
|
||||
|
||||
@@ -6,10 +6,10 @@
|
||||
<title>Optionally Customizing the Development Packages Installation</title>
|
||||
|
||||
<para>
|
||||
Because the Yocto Project is suited for embedded Linux development, it is
|
||||
likely that you will need to customize your development packages installation.
|
||||
For example, if you are developing a minimal image, then you might not need
|
||||
certain packages (e.g. graphics support packages).
|
||||
Because the Yocto Project is suited for embedded Linux development, it is
|
||||
likely that you will need to customize your development packages installation.
|
||||
For example, if you are developing a minimal image, then you might not need
|
||||
certain packages (e.g. graphics support packages).
|
||||
Thus, you would like to be able to remove those packages from your target sysroot.
|
||||
</para>
|
||||
|
||||
@@ -17,24 +17,24 @@
|
||||
<title>Package Management Systems</title>
|
||||
|
||||
<para>
|
||||
The OpenEmbedded build system supports the generation of sysroot files using
|
||||
The OpenEmbedded build system supports the generation of sysroot files using
|
||||
three different Package Management Systems (PMS):
|
||||
<itemizedlist>
|
||||
<listitem><para><emphasis>OPKG:</emphasis> A less well known PMS whose use
|
||||
originated in the OpenEmbedded and OpenWrt embedded Linux projects.
|
||||
<listitem><para><emphasis>OPKG:</emphasis> A less well known PMS whose use
|
||||
originated in the OpenEmbedded and OpenWrt embedded Linux projects.
|
||||
This PMS works with files packaged in an <filename>.ipk</filename> format.
|
||||
See <ulink url='http://en.wikipedia.org/wiki/Opkg'></ulink> for more
|
||||
See <ulink url='http://en.wikipedia.org/wiki/Opkg'></ulink> for more
|
||||
information about OPKG.</para></listitem>
|
||||
<listitem><para><emphasis>RPM:</emphasis> A more widely known PMS intended for GNU/Linux
|
||||
distributions.
|
||||
<listitem><para><emphasis>RPM:</emphasis> A more widely known PMS intended for GNU/Linux
|
||||
distributions.
|
||||
This PMS works with files packaged in an <filename>.rpm</filename> format.
|
||||
The build system currently installs through this PMS by default.
|
||||
The build system currently installs through this PMS by default.
|
||||
See <ulink url='http://en.wikipedia.org/wiki/RPM_Package_Manager'></ulink>
|
||||
for more information about RPM.</para></listitem>
|
||||
<listitem><para><emphasis>Debian:</emphasis> The PMS for Debian-based systems
|
||||
is built on many PMS tools.
|
||||
The lower-level PMS tool <filename>dpkg</filename> forms the base of the Debian PMS.
|
||||
For information on dpkg see
|
||||
<listitem><para><emphasis>Debian:</emphasis> The PMS for Debian-based systems
|
||||
is built on many PMS tools.
|
||||
The lower-level PMS tool <filename>dpkg</filename> forms the base of the Debian PMS.
|
||||
For information on dpkg see
|
||||
<ulink url='http://en.wikipedia.org/wiki/Dpkg'></ulink>.</para></listitem>
|
||||
</itemizedlist>
|
||||
</para>
|
||||
@@ -44,31 +44,31 @@
|
||||
<title>Configuring the PMS</title>
|
||||
|
||||
<para>
|
||||
Whichever PMS you are using, you need to be sure that the
|
||||
Whichever PMS you are using, you need to be sure that the
|
||||
<ulink url='&YOCTO_DOCS_REF_URL;#var-PACKAGE_CLASSES'><filename>PACKAGE_CLASSES</filename></ulink>
|
||||
variable in the <filename>conf/local.conf</filename>
|
||||
file is set to reflect that system.
|
||||
file is set to reflect that system.
|
||||
The first value you choose for the variable specifies the package file format for the root
|
||||
filesystem at sysroot.
|
||||
Additional values specify additional formats for convenience or testing.
|
||||
Additional values specify additional formats for convenience or testing.
|
||||
See the configuration file for details.
|
||||
</para>
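As a quick, hedged illustration of the setting described above (the particular classes chosen here are assumptions, not values taken from this change), the relevant <filename>conf/local.conf</filename> line looks like this:
<literallayout class='monospaced'>
# conf/local.conf (sketch): the first class sets the package format used for
# the root filesystem; any additional classes are extra formats for testing.
PACKAGE_CLASSES ?= "package_rpm package_ipk"
</literallayout>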

<note>
For build performance information related to the PMS, see the
"<ulink url='&YOCTO_DOCS_REF_URL;#ref-classes-package'>Packaging - <filename>package*.bbclass</filename></ulink>"
section in the Yocto Project Reference Manual.
For build performance information related to the PMS, see
<ulink url='&YOCTO_DOCS_REF_URL;#ref-classes-package'>Packaging - <filename>package*.bbclass</filename></ulink>
in the Yocto Project Reference Manual.
</note>

<para>
As an example, consider a scenario where you are using OPKG and you want to add
the <filename>libglade</filename> package to the target sysroot.
</para>

<para>
First, you should generate the <filename>ipk</filename> file for the
<filename>libglade</filename> package and add it
into a working <filename>opkg</filename> repository.
Use these commands:
<literallayout class='monospaced'>
$ bitbake libglade
@@ -77,12 +77,12 @@
</para>

<para>
Next, source the environment setup script found in the
<ulink url='&YOCTO_DOCS_DEV_URL;#source-directory'>Source Directory</ulink>.
Follow that by setting up the installation destination to point to your
sysroot as <filename><sysroot_dir></filename>.
Finally, have an OPKG configuration file <filename><conf_file></filename>
that corresponds to the <filename>opkg</filename> repository you have just created.
The following command forms should now work:
<literallayout class='monospaced'>
$ opkg-cl -f <conf_file> -o <sysroot_dir> update
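# Sketch of the matching install step; <conf_file>, <sysroot_dir>, and the
# libglade package are the placeholders used in the surrounding text:
$ opkg-cl -f <conf_file> -o <sysroot_dir> install libglade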

@@ -8,7 +8,7 @@

<para>
In order to develop applications, you need to set up your host development system.
Several ways exist that allow you to install cross-development tools, QEMU, the
Eclipse Yocto Plug-in, and other tools.
This chapter describes how to prepare for application development.
</para>
@@ -18,43 +18,43 @@

<para>
The following list describes installation methods that set up varying degrees of tool
availability on your system.
availabiltiy on your system.
Regardless of the installation method you choose,
you must <filename>source</filename> the cross-toolchain
environment setup script before you use a toolchain.
See the "<link linkend='setting-up-the-cross-development-environment'>Setting Up the
Cross-Development Environment</link>" section for more information.
</para>

<note>
<para>Avoid mixing installation methods when installing toolchains for different architectures.
For example, avoid using the ADT Installer to install some toolchains and then hand-installing
cross-development toolchains by running the toolchain installer for different architectures.
Mixing installation methods can result in situations where the ADT Installer becomes
unreliable and might not install the toolchain.</para>
<para>If you must mix installation methods, you might avoid problems by deleting
<filename>/var/lib/opkg</filename>, thus purging the <filename>opkg</filename> package
metadata.</para>
</note>

<para>
<itemizedlist>
<listitem><para><emphasis>Use the ADT installer script:</emphasis>
<listitem><para><emphasis>Use the ADT Installer Script:</emphasis>
This method is the recommended way to install the ADT because it
automates much of the process for you.
For example, you can configure the installation to install the QEMU emulator
and the user-space NFS, specify which root filesystem profiles to download,
and define the target sysroot location.</para></listitem>
<listitem><para><emphasis>Use an existing toolchain:</emphasis>
<listitem><para><emphasis>Use an Existing Toolchain:</emphasis>
Using this method, you select and download an architecture-specific
toolchain installer and then run the script to hand-install the toolchain.
If you use this method, you just get the cross-toolchain and QEMU - you do not
get any of the other mentioned benefits had you run the ADT Installer script.</para></listitem>
<listitem><para><emphasis>Use the toolchain from within the Build Directory:</emphasis>
If you already have a
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>,
<listitem><para><emphasis>Use the Toolchain from within the Build Directory:</emphasis>
If you already have a
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>,
you can build the cross-toolchain within the directory.
However, like the previous method mentioned, you only get the cross-toolchain and QEMU - you
do not get any of the other benefits without taking separate steps.</para></listitem>
</itemizedlist>
</para>
@@ -63,16 +63,8 @@
<title>Using the ADT Installer</title>

<para>
To run the ADT Installer, you need to get the ADT Installer tarball, be sure
you have the necessary host development packages that support the ADT Installer,
and then run the ADT Installer Script.
</para>

<para>
For a list of the host packages needed to support ADT installation and use, see the
"ADT Installer Extras" lists in the
"<ulink url='&YOCTO_DOCS_REF_URL;#required-packages-for-the-host-development-system'>Required Packages for the Host Development System</ulink>" section
of the Yocto Project Reference Manual.
To run the ADT Installer, you need to first get the ADT Installer tarball and then run the ADT
Installer Script.
</para>

<section id='getting-the-adt-installer-tarball'>
@@ -80,27 +72,27 @@

<para>
The ADT Installer is contained in the ADT Installer tarball.
You can download the tarball into any directory from the
<ulink url='&YOCTO_DL_URL;/releases'>Index of Releases</ulink>, specifically
at
<ulink url='&YOCTO_ADTINSTALLER_DL_URL;'></ulink>.
Or, you can use BitBake to generate the tarball inside the existing
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>.
</para>

<para>
If you use BitBake to generate the ADT Installer tarball, you must
<filename>source</filename> the environment setup script
(<ulink url='&YOCTO_DOCS_REF_URL;#structure-core-script'><filename>&OE_INIT_FILE;</filename></ulink>)
located in the Source Directory before running the
BitBake command that creates the tarball.
If you use BitBake to generate the ADT Installer tarball, you must
<filename>source</filename> the environment setup script
(<filename>&OE_INIT_FILE;</filename>) located
in the Source Directory before running the <filename>bitbake</filename>
command that creates the tarball.
</para>

<para>
The following example commands download the Poky tarball, set up the
<ulink url='&YOCTO_DOCS_DEV_URL;#source-directory'>Source Directory</ulink>,
set up the environment while also creating the default Build Directory,
and run the BitBake command that results in the tarball
and run the <filename>bitbake</filename> command that results in the tarball
<filename>~/yocto-project/build/tmp/deploy/sdk/adt_installer.tar.bz2</filename>:
<literallayout class='monospaced'>
$ cd ~
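# Sketch only - the example is cut off by the hunk boundary above.  The
# remaining commands typically follow this pattern; release names and
# directories here are assumptions, not taken from this diff:
$ tar xjf poky-<release>.tar.bz2
$ cd poky-<release>
$ source &OE_INIT_FILE;
$ bitbake adt-installer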
@@ -120,107 +112,112 @@
<para>
Before running the ADT Installer script, you need to unpack the tarball.
You can unpack the tarball in any directory you wish.
For example, this command copies the ADT Installer tarball from where
it was built into the home directory and then unpacks the tarball into
a top-level directory named <filename>adt-installer</filename>:
<literallayout class='monospaced'>
$ cd ~
$ cp ~/poky/build/tmp/deploy/sdk/adt_installer.tar.bz2 $HOME
$ tar -xjf adt_installer.tar.bz2
</literallayout>
Unpacking it creates the directory <filename>adt-installer</filename>,
which contains the ADT Installer script (<filename>adt_installer</filename>)
and its configuration file (<filename>adt_installer.conf</filename>).
</para>

<para>
Before you run the script, however, you should examine the ADT Installer configuration
file and be sure you are going to get what you want.
Your configurations determine which kernel and filesystem image are downloaded.
</para>

<para>
The following list describes the configurations you can define for the ADT Installer.
For configuration values and restrictions, see the comments in
the <filename>adt-installer.conf</filename> file:

<itemizedlist>
<listitem><para><filename>YOCTOADT_REPO</filename>: This area
includes the IPKG-based packages and the root filesystem upon which
the installation is based.
If you want to set up your own IPKG repository pointed to by
<filename>YOCTOADT_REPO</filename>, you need to be sure that the
directory structure follows the same layout as the reference directory
set up at <ulink url='http://adtrepo.yoctoproject.org'></ulink>.
Also, your repository needs to be accessible through HTTP.</para></listitem>
<listitem><para><filename>YOCTOADT_TARGETS</filename>: The machine
target architectures for which you want to set up cross-development
environments.</para></listitem>
<listitem><para><filename>YOCTOADT_QEMU</filename>: Indicates whether
or not to install the emulator QEMU.</para></listitem>
<listitem><para><filename>YOCTOADT_NFS_UTIL</filename>: Indicates whether
or not to install user-mode NFS.
If you plan to use the Eclipse IDE Yocto plug-in against QEMU,
you should install NFS.
<note>To boot QEMU images using our userspace NFS server, you need
to be running <filename>portmap</filename> or <filename>rpcbind</filename>.
If you are running <filename>rpcbind</filename>, you will also need to add the
<filename>-i</filename> option when <filename>rpcbind</filename> starts up.
Please make sure you understand the security implications of doing this.
You might also have to modify your firewall settings to allow
NFS booting to work.</note></para></listitem>
<listitem><para><filename>YOCTOADT_ROOTFS_<arch></filename>: The root
filesystem images you want to download from the
<filename>YOCTOADT_IPKG_REPO</filename> repository.</para></listitem>
<listitem><para><filename>YOCTOADT_TARGET_SYSROOT_IMAGE_<arch></filename>: The
particular root filesystem used to extract and create the target sysroot.
The value of this variable must have been specified with
<filename>YOCTOADT_ROOTFS_<arch></filename>.
For example, if you downloaded both <filename>minimal</filename> and
<filename>sato-sdk</filename> images by setting
<filename>YOCTOADT_ROOTFS_<arch></filename>
to "minimal sato-sdk", then <filename>YOCTOADT_TARGET_SYSROOT_IMAGE_<arch></filename>
must be set to either <filename>minimal</filename> or
<filename>sato-sdk</filename>.</para></listitem>
<listitem><para><filename>YOCTOADT_TARGET_SYSROOT_LOC_<arch></filename>: The
location on the development host where the target sysroot is created.
</para></listitem>
</itemizedlist>
</para>
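To make the variables above concrete, a hedged sketch of an <filename>adt_installer.conf</filename> fragment follows; the repository URL, target list, and sysroot location are illustrative assumptions only:
<literallayout class='monospaced'>
# adt_installer.conf (sketch) - adjust every value for your own setup
YOCTOADT_REPO="http://adtrepo.yoctoproject.org/<release>"
YOCTOADT_TARGETS="arm x86"
YOCTOADT_QEMU="Y"
YOCTOADT_NFS_UTIL="Y"
YOCTOADT_ROOTFS_arm="minimal sato-sdk"
YOCTOADT_TARGET_SYSROOT_IMAGE_arm="sato-sdk"
YOCTOADT_TARGET_SYSROOT_LOC_arm="$HOME/test-yocto/arm"
</literallayout>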

<para>
After you have configured the <filename>adt_installer.conf</filename> file,
run the installer using the following command.
Be sure that you are not trying to use cross-compilation tools.
When you run the installer, the environment must use a
host <filename>gcc</filename>:
<literallayout class='monospaced'>
$ cd ~/adt-installer
$ ./adt_installer
</literallayout>
Once the installer begins to run, you are asked to enter the
location for cross-toolchain installation.
The default location is
<filename>/opt/poky/<release></filename>.
After either accepting the default location or selecting your
own location, you are prompted to run the installation script
interactively or in silent mode.
If you want to closely monitor the installation,
choose “I” for interactive mode rather than “S” for silent mode.
</para>

<note>
The ADT Installer requires the <filename>libtool</filename> package to complete.
If you install the recommended packages as described in
"<ulink url='&YOCTO_DOCS_QS_URL;#packages'>The Packages</ulink>"
section of the Yocto Project Quick Start, then you will have libtool installed.
</note>
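If <filename>libtool</filename> turns out to be missing, installing it through the host's package manager before re-running the installer is usually enough; for example, on a Debian- or Ubuntu-based host (the command assumes that distribution family):
<literallayout class='monospaced'>
$ sudo apt-get install libtool
</literallayout>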

<para>
Once the installer begins to run, you are asked to enter the location for
cross-toolchain installation.
The default location is <filename>/opt/poky/<release></filename>.
After selecting the location, you are prompted to run in
interactive or silent mode.
If you want to closely monitor the installation, choose “I” for interactive
mode rather than “S” for silent mode.
Follow the prompts from the script to complete the installation.
</para>

<para>
Once the installation completes, the ADT, which includes the
cross-toolchain, is installed in the selected installation
directory.
You will notice environment setup files for the cross-toolchain
in the installation directory, and image tarballs in the
<filename>adt-installer</filename> directory according to your
installer configurations, and the target sysroot located
according to the
<filename>YOCTOADT_TARGET_SYSROOT_LOC_<arch></filename>
variable also in your configuration file.
Once the installation completes, the ADT, which includes the cross-toolchain, is installed.
You will notice environment setup files for the cross-toolchain in
<filename>&YOCTO_ADTPATH_DIR;</filename>,
and image tarballs in the <filename>adt-installer</filename>
directory according to your installer configurations, and the target sysroot located
according to the <filename>YOCTOADT_TARGET_SYSROOT_LOC_<arch></filename> variable
also in your configuration file.
</para>
</section>
</section>
@@ -229,87 +226,57 @@
<title>Using a Cross-Toolchain Tarball</title>

<para>
If you want to simply install the cross-toolchain by hand, you can
do so by running the toolchain installer.
If you use this method to install the cross-toolchain and you
might still need to install the target sysroot by installing and
extracting it separately.
For information on how to install the sysroot, see the
If you want to simply install the cross-toolchain by hand, you can do so by running the
toolchain installer.
If you use this method to install the cross-toolchain and you still need to install the target
sysroot, you will have to extract and install sysroot separately.
For information on how to do this, see the
"<link linkend='extracting-the-root-filesystem'>Extracting the Root Filesystem</link>" section.
</para>

<para>
Follow these steps:
<orderedlist>
<listitem><para>Go to
<ulink url='&YOCTO_TOOLCHAIN_DL_URL;'></ulink>
and find the folder that matches your host development system
(i.e. <filename>i686</filename> for 32-bit machines or
<filename>x86-64</filename> for 64-bit machines).</para></listitem>
<listitem><para>Go into that folder and download the toolchain installer whose name
includes the appropriate target architecture.
For example, if your host development system is an Intel-based 64-bit system and
you are going to use your cross-toolchain for an Intel-based 32-bit target, go into the
<filename>x86_64</filename> folder and download the following installer:
<literallayout class='monospaced'>
poky-eglibc-x86_64-i586-toolchain-gmae-&DISTRO;.sh
</literallayout>
<note><para>As an alternative to steps one and two, you can
build the toolchain installer if you have a
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>.
If you need GMAE, you should use the
<filename>bitbake meta-toolchain-gmae</filename>
command.
Running the resulting installation script will support
such development.
If you are not concerned with GMAE, you can generate
the toolchain installer using
<filename>bitbake meta-toolchain</filename>.
Either of these methods requires you to still
install the target sysroot by installing and
extracting it separately.
For information on how to install the sysroot, see the
"<link linkend='extracting-the-root-filesystem'>Extracting the Root Filesystem</link>" section.
</para>
<para>A final method of building the toolchain installer
exists that has significant advantages over the previous
two methods.
This method results in a toolchain installer that
contains the sysroot that matches your target root
filesystem.
To build this installer, use the
<filename>bitbake image -c populate_sdk</filename>
command.</para>
<para>Remember, before using any
<filename>bitbake</filename> command, you must source
the <filename>&OE_INIT_PATH;</filename> script
located in the Source Directory and you must make sure
your <filename>conf/local.conf</filename> variables are
correct.
In particular, you need to be sure the
<ulink url='&YOCTO_DOCS_REF_URL;#var-MACHINE'><filename>MACHINE</filename></ulink>
variable matches the architecture for which you are
building and that the <filename>SDKMACHINE</filename>
variable is correctly set if you are building
a toolchain for an architecture that differs from your
current development host machine.</para>
<para>When the BitBake command
completes, the toolchain installer will be in
<filename>tmp/deploy/sdk</filename> in the Build
Directory.</para>
</note></para></listitem>
<note><para>As an alternative to steps one and two, you can build the toolchain installer
if you have a <ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>.
If you need GMAE, you should use the <filename>bitbake meta-toolchain-gmae</filename>
command.
The resulting installation script when run will support such development.
However, if you are not concerned with GMAE,
you can generate the toolchain installer using
<filename>bitbake meta-toolchain</filename>.</para>
<para>Use the appropriate <filename>bitbake</filename> command only after you have
sourced the <filename>&OE_INIT_PATH;</filename> script located in the Source
Directory.
When the <filename>bitbake</filename> command completes, the toolchain installer will
be in <filename>tmp/deploy/sdk</filename> in the Build Directory.
</para></note>
</para></listitem>
<listitem><para>Once you have the installer, run it to install the toolchain.
You must change the permissions on the toolchain installer
script so that it is executable.</para>
<para>The following command shows how to run the installer given a toolchain tarball
for a 64-bit development host system and a 32-bit target architecture.
The example assumes the toolchain installer is located in <filename>~/Downloads/</filename>.
<literallayout class='monospaced'>
$ ~/Downloads/poky-eglibc-x86_64-i586-toolchain-gmae-&DISTRO;.sh
</literallayout>
<note>
If you do not have write permissions for the directory into which you are installing
the toolchain, the toolchain installer notifies you and exits.
Be sure you have write permissions in the directory and run the installer again.
</note>
Once the tarball is expanded, the cross-toolchain is installed.
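Put together, the permission change and the installer run from the example above form a short sequence such as the following (same assumed download location and file name as above):
<literallayout class='monospaced'>
$ chmod +x ~/Downloads/poky-eglibc-x86_64-i586-toolchain-gmae-&DISTRO;.sh
$ ~/Downloads/poky-eglibc-x86_64-i586-toolchain-gmae-&DISTRO;.sh
</literallayout>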
@@ -323,50 +290,50 @@
<title>Using BitBake and the Build Directory</title>

<para>
A final way of making the cross-toolchain available is to use BitBake
to generate the toolchain within an existing
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>.
This method does not install the toolchain into the default
This method does not install the toolchain into the
<filename>/opt</filename> directory.
As with the previous method, if you need to install the target sysroot, you must
do that separately as well.
</para>

<para>
Follow these steps to generate the toolchain into the Build Directory:
<orderedlist>
<listitem><para>Source the environment setup script
<filename>&OE_INIT_FILE;</filename> located in the
<ulink url='&YOCTO_DOCS_DEV_URL;#source-directory'>Source Directory</ulink>.
</para></listitem>
<listitem><para>At this point, you should be sure that the
<ulink url='&YOCTO_DOCS_REF_URL;#var-MACHINE'><filename>MACHINE</filename></ulink> variable
in the <filename>local.conf</filename> file found in the
<filename>conf</filename> directory of the Build Directory
is set for the target architecture.
Comments within the <filename>local.conf</filename> file list the values you
can use for the <filename>MACHINE</filename> variable.
<note>You can populate the Build Directory with the cross-toolchains for more
than a single architecture.
You just need to edit the <filename>MACHINE</filename> variable in the
<filename>local.conf</filename> file and re-run the BitBake
command.</note></para></listitem>
<listitem><para>Run <filename>bitbake meta-ide-support</filename> to complete the
cross-toolchain generation.
<note>If you change out of your working directory after you
<filename>source</filename> the environment setup script and before you run
the BitBake command, the command might not work.
Be sure to run the BitBake command immediately
after checking or editing the <filename>local.conf</filename> but without
the <filename>bitbake</filename> command, the command might not work.
Be sure to run the <filename>bitbake</filename> command immediately
after checking or editing the <filename>local.conf</filename> but without
changing out of your working directory.</note>
Once the BitBake command finishes,
Once the <filename>bitbake</filename> command finishes,
the cross-toolchain is generated and populated within the Build Directory.
You will notice environment setup files for the cross-toolchain in the
Build Directory in the <filename>tmp</filename> directory.
Setup script filenames contain the strings <filename>environment-setup</filename>.</para>
<para>Be aware that when you use this method to install the toolchain you still need
to separately extract and install the sysroot filesystem.
For information on how to do this, see the
"<link linkend='extracting-the-root-filesystem'>Extracting the Root Filesystem</link>" section.
</para></listitem>
</orderedlist>
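Condensed into commands, the steps above look roughly like the following sketch; the Source Directory location and the <filename>MACHINE</filename> value are assumptions:
<literallayout class='monospaced'>
$ cd ~/poky
$ source &OE_INIT_FILE;           # step 1: sets up the environment and enters the Build Directory
$ vi conf/local.conf              # step 2: check or set MACHINE, e.g. MACHINE ?= "qemux86"
$ bitbake meta-ide-support        # step 3: generate the cross-toolchain
</literallayout>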
@@ -378,26 +345,24 @@
<title>Setting Up the Cross-Development Environment</title>

<para>
Before you can develop using the cross-toolchain, you need to set up the
cross-development environment by sourcing the toolchain's environment setup script.
If you used the ADT Installer or hand-installed cross-toolchain,
then you can find this script in the directory you chose for installation.
The default installation directory is the <filename>&YOCTO_ADTPATH_DIR;</filename>
directory.
If you installed the toolchain in the
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>,
you can find the environment setup
then you can find this script in the <filename>&YOCTO_ADTPATH_DIR;</filename>
directory.
If you installed the toolchain in the
<ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>,
you can find the environment setup
script for the toolchain in the Build Directory's <filename>tmp</filename> directory.
</para>

<para>
Be sure to run the environment setup script that matches the architecture for
which you are developing.
Environment setup scripts begin with the string "<filename>environment-setup</filename>"
and include as part of their name the architecture.
For example, the toolchain environment setup script for a 64-bit
IA-based architecture installed in the default installation directory
would be the following:
<para>
Be sure to run the environment setup script that matches the architecture for
which you are developing.
Environment setup scripts begin with the string “<filename>environment-setup</filename>”
and include as part of their name the architecture.
For example, the toolchain environment setup script for a 64-bit IA-based architecture would
be the following:
<literallayout class='monospaced'>
&YOCTO_ADTPATH_DIR;/environment-setup-x86_64-poky-linux
</literallayout>
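A short, hedged usage example follows (the script path is the default one shown above; the check simply confirms the cross-compiler is now set in your environment):
<literallayout class='monospaced'>
$ source &YOCTO_ADTPATH_DIR;/environment-setup-x86_64-poky-linux
$ echo $CC          # should now name the cross-compiler rather than the host gcc
</literallayout>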
@@ -408,7 +373,7 @@
<title>Securing Kernel and Filesystem Images</title>

<para>
You will need to have a kernel and filesystem image to boot using your
hardware or the QEMU emulator.
Furthermore, if you plan on booting your image using NFS or you want to use the root filesystem
as the target sysroot, you need to extract the root filesystem.
@@ -420,62 +385,62 @@
<para>
To get the kernel and filesystem images, you either have to build them or download
pre-built versions.
You can find examples for both these situations in the
"<ulink url='&YOCTO_DOCS_QS_URL;#test-run'>A Quick Test Run</ulink>" section of
the Yocto Project Quick Start.
</para>

<para>
The Yocto Project ships basic kernel and filesystem images for several
architectures (<filename>x86</filename>, <filename>x86-64</filename>,
<filename>mips</filename>, <filename>powerpc</filename>, and <filename>arm</filename>)
that you can use unaltered in the QEMU emulator.
These kernel images reside in the release
area - <ulink url='&YOCTO_MACHINES_DL_URL;'></ulink>
and are ideal for experimentation using Yocto Project.
For information on the image types you can build using the OpenEmbedded build system,
see the
"<ulink url='&YOCTO_DOCS_REF_URL;#ref-images'>Images</ulink>" chapter in
the Yocto Project Reference Manual.
</para>

<para>
If you are planning on developing against your image and you are not
building or using one of the Yocto Project development images
(e.g. <filename>core-image-*-dev</filename>), you must be sure to
include the development packages as part of your image recipe.
</para>
building or using one of the Yocto Project development images
(e.g. core-image-*-dev), you must be sure to include the development
packages as part of your image recipe.
</para>

<para>
Furthermore, if you plan on remotely deploying and debugging your
application from within the
Eclipse IDE, you must have an image that contains the Yocto Target Communication
Framework (TCF) agent (<filename>tcf-agent</filename>).
By default, the Yocto Project provides only one type of pre-built image that contains the
<filename>tcf-agent</filename>.
And, those images are SDK images (e.g. <filename>core-image-sato-sdk</filename>).
</para>

<para>
If you want to use a different image type that contains the <filename>tcf-agent</filename>,
you can do so one of two ways:
<itemizedlist>
<listitem><para>Modify the <filename>conf/local.conf</filename> configuration in
the <ulink url='&YOCTO_DOCS_DEV_URL;#build-directory'>Build Directory</ulink>
and then rebuild the image.
With this method, you need to modify the
<ulink url='&YOCTO_DOCS_REF_URL;#var-EXTRA_IMAGE_FEATURES'><filename>EXTRA_IMAGE_FEATURES</filename></ulink>
variable to have the value of "tools-debug" before rebuilding the image
(see the configuration sketch after these steps).
Once the image is rebuilt, the <filename>tcf-agent</filename> will be included
in the image and is launched automatically after the boot.</para></listitem>
<listitem><para>Manually build the <filename>tcf-agent</filename>.
To build the agent, follow these steps:
<orderedlist>
<listitem><para>Be sure the ADT is installed as described in the
"<link linkend='installing-the-adt'>Installing the ADT and Toolchains</link>" section.
</para></listitem>
<listitem><para>Set up the cross-development environment as described in the
"<link linkend='setting-up-the-cross-development-environment'>Setting
Up the Cross-Development Environment</link>" section.</para></listitem>
<listitem><para>Get the <filename>tcf-agent</filename> source code using
the following commands:
@@ -484,17 +449,17 @@
$ cd agent
</literallayout></para></listitem>
<listitem><para>Modify the <filename>Makefile.inc</filename> file
for the cross-compilation environment by setting the
<filename>OPSYS</filename> and
<ulink url='&YOCTO_DOCS_REF_URL;#var-MACHINE'><filename>MACHINE</filename></ulink>
variables according to your target.</para></listitem>
<listitem><para>Use the cross-development tools to build the
<filename>tcf-agent</filename>.
Before you "Make" the file, be sure your cross-tools are set up first.
See the "<link linkend='makefile-based-projects'>Makefile-Based Projects</link>"
section for information on how to make sure the cross-tools are set up
correctly.</para>
<para>If the build is successful, the <filename>tcf-agent</filename> output will
be <filename>obj/$(OPSYS)/$(MACHINE)/Debug/agent</filename>.</para></listitem>
<listitem><para>Deploy the agent into the image's root filesystem.</para></listitem>
</orderedlist>
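The configuration sketch referred to in the first bullet above is a small <filename>conf/local.conf</filename> addition along these lines (the append operator and the image rebuilt afterwards are assumptions):
<literallayout class='monospaced'>
# conf/local.conf (sketch): pull the debugging tools, including tcf-agent,
# into the image, then rebuild it (e.g. bitbake core-image-sato)
EXTRA_IMAGE_FEATURES += "tools-debug"
</literallayout>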
@@ -509,29 +474,29 @@
<para>
You must extract the root filesystem if you want to boot the image using NFS
or you want to use the root filesystem as the target sysroot.
For example, the Eclipse IDE environment with the Eclipse Yocto Plug-in installed allows you
to use QEMU to boot under NFS.
Another example is if you want to develop your target application using the
root filesystem as the target sysroot.
</para>

<para>
To extract the root filesystem, first <filename>source</filename>
the cross-development environment setup script and then
use the <filename>runqemu-extract-sdk</filename> command on the
filesystem image.
For example, the following commands set up the environment and then extract
the root filesystem from a previously built filesystem image tarball named
<filename>core-image-sato-sdk-qemux86-2011091411831.rootfs.tar.bz2</filename>.
The example extracts the root filesystem into the <filename>$HOME/qemux86-sato</filename>
directory:
<literallayout class='monospaced'>
$ source $HOME/toolchain_dir/environment-setup-i586-poky-linux
$ source $HOME/poky/build/tmp/environment-setup-i586-poky-linux
$ runqemu-extract-sdk \
~/Downloads/core-image-sato-sdk-qemux86-2011091411831.rootfs.tar.bz2 \
tmp/deploy/images/core-image-sato-sdk-qemux86-2011091411831.rootfs.tar.bz2 \
$HOME/qemux86-sato
</literallayout>
In this case, you could now point to the target sysroot at
<filename>$HOME/qemux86-sato</filename>.
</para>
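As a quick sanity check after extraction (a hedged example using the same directory as above), listing the new sysroot should show an ordinary root filesystem layout:
<literallayout class='monospaced'>
$ ls $HOME/qemux86-sato
# expect the usual top-level directories here (bin, etc, lib, usr, var, ...)
</literallayout>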
</section>

@@ -3,8 +3,4 @@

<xsl:import href="http://docbook.sourceforge.net/release/xsl/current/xhtml/docbook.xsl" />

<xsl:param name="html.stylesheet" select="'bsp-style.css'" />
<xsl:param name="chapter.autolabel" select="1" />
<xsl:param name="section.autolabel" select="1" />
<xsl:param name="section.label.includes.component.label" select="1" />
</xsl:stylesheet>

@@ -1,27 +0,0 @@
<?xml version='1.0'?>
<xsl:stylesheet
xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns="http://www.w3.org/1999/xhtml"
xmlns:fo="http://www.w3.org/1999/XSL/Format"
version="1.0">

<xsl:import
href="http://docbook.sourceforge.net/release/xsl/current/eclipse/eclipse3.xsl" />

<xsl:param name="chunker.output.indent" select="'yes'"/>
<xsl:param name="chunk.quietly" select="1"/>
<xsl:param name="chunk.first.sections" select="1"/>
<xsl:param name="chunk.section.depth" select="10"/>
<xsl:param name="use.id.as.filename" select="1"/>
<xsl:param name="ulink.target" select="'_self'" />
<xsl:param name="base.dir" select="'html/bsp-guide/'"/>
<xsl:param name="html.stylesheet" select="'../book.css'"/>
<xsl:param name="eclipse.manifest" select="0"/>
<xsl:param name="create.plugin.xml" select="0"/>
<xsl:param name="suppress.navigation" select="1"/>
<xsl:param name="generate.index" select="0"/>
<xsl:param name="chapter.autolabel" select="1" />
<xsl:param name="appendix.autolabel" select="1" />
<xsl:param name="section.autolabel" select="1" />
<xsl:param name="section.label.includes.component.label" select="1" />
</xsl:stylesheet>
@@ -2,7 +2,7 @@
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >

<book id='bsp-guide' lang='en'
xmlns:xi="http://www.w3.org/2003/XInclude"
xmlns="http://docbook.org/ns/docbook"
>
@@ -10,15 +10,13 @@

<mediaobject>
<imageobject>
<imagedata fileref='figures/bsp-title.png'
format='SVG'
align='center' scalefit='1' width='100%'/>
</imageobject>
</mediaobject>

<title>
Yocto Project Board Support Package Developer's Guide
</title>
<title></title>

<authorgroup>
<author>
@@ -68,36 +66,6 @@
<date>October 2012</date>
<revremark>Released with the Yocto Project 1.3 Release.</revremark>
</revision>
<revision>
<revnumber>1.4</revnumber>
<date>April 2013</date>
<revremark>Released with the Yocto Project 1.4 Release.</revremark>
</revision>
<revision>
<revnumber>1.4.1</revnumber>
<date>June 2013</date>
<revremark>Released with the Yocto Project 1.4.1 Release.</revremark>
</revision>
<revision>
<revnumber>1.4.2</revnumber>
<date>August 2013</date>
<revremark>Released with the Yocto Project 1.4.2 Release.</revremark>
</revision>
<revision>
<revnumber>1.4.3</revnumber>
<date>March 2014</date>
<revremark>Released with the Yocto Project 1.4.3 Release.</revremark>
</revision>
<revision>
<revnumber>1.4.4</revnumber>
<date>May 2014</date>
<revremark>Released with the Yocto Project 1.4.4 Release.</revremark>
</revision>
<revision>
<revnumber>1.4.5</revnumber>
<date>July 2014</date>
<revremark>Released with the Yocto Project 1.4.5 Release.</revremark>
</revision>
</revhistory>

<copyright>
@@ -107,12 +75,12 @@

<legalnotice>
<para>
Permission is granted to copy, distribute and/or modify this document under
the terms of the <ulink type="http" url="http://creativecommons.org/licenses/by-nc-sa/2.0/uk/">Creative Commons Attribution-Non-Commercial-Share Alike 2.0 UK: England & Wales</ulink> as published by Creative Commons.
</para>
<note>
Due to production processes, there could be differences between the Yocto Project
documentation bundled in the release tarball and the
<ulink url='&YOCTO_DOCS_BSP_URL;'>Yocto Project Board Support Package (BSP) Developer's Guide</ulink> on
the <ulink url='&YOCTO_HOME_URL;'>Yocto Project</ulink> website.
For the latest version of this manual, see the manual on the website.
@@ -129,6 +97,6 @@
-->

</book>
<!--
vim: expandtab tw=80 ts=4
-->

File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -3,8 +3,6 @@

<xsl:import href="http://docbook.sourceforge.net/release/xsl/current/xhtml/docbook.xsl" />

<xsl:param name="html.stylesheet" select="'dev-style.css'" />
<xsl:param name="chapter.autolabel" select="1" />
<xsl:param name="section.autolabel" select="1" />
<xsl:param name="section.label.includes.component.label" select="1" />
<!-- <xsl:param name="generate.toc" select="'article nop'"></xsl:param> -->

</xsl:stylesheet>

@@ -1,27 +0,0 @@
<?xml version='1.0'?>
<xsl:stylesheet
xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns="http://www.w3.org/1999/xhtml"
xmlns:fo="http://www.w3.org/1999/XSL/Format"
version="1.0">

<xsl:import
href="http://docbook.sourceforge.net/release/xsl/current/eclipse/eclipse3.xsl" />

<xsl:param name="chunker.output.indent" select="'yes'"/>
<xsl:param name="chunk.quietly" select="1"/>
<xsl:param name="chunk.first.sections" select="1"/>
<xsl:param name="chunk.section.depth" select="10"/>
<xsl:param name="use.id.as.filename" select="1"/>
<xsl:param name="ulink.target" select="'_self'" />
<xsl:param name="base.dir" select="'html/dev-manual/'"/>
<xsl:param name="html.stylesheet" select="'../book.css'"/>
<xsl:param name="eclipse.manifest" select="0"/>
<xsl:param name="create.plugin.xml" select="0"/>
<xsl:param name="suppress.navigation" select="1"/>
<xsl:param name="generate.index" select="0"/>
<xsl:param name="chapter.autolabel" select="1" />
<xsl:param name="appendix.autolabel" select="1" />
<xsl:param name="section.autolabel" select="1" />
<xsl:param name="section.label.includes.component.label" select="1" />
</xsl:stylesheet>
@@ -10,61 +10,43 @@
|
||||
|
||||
<para>
|
||||
Welcome to the Yocto Project Development Manual!
|
||||
This manual provides information on how to use the Yocto Project to
|
||||
develop embedded Linux images and user-space applications that
|
||||
run on targeted devices.
|
||||
The manual provides an overview of image, kernel, and
|
||||
user-space application development using the Yocto Project.
|
||||
Because much of the information in this manual is general, it
|
||||
contains many references to other sources where you can find more
|
||||
detail.
|
||||
For example, you can find detailed information on Git, repositories,
|
||||
and open source in general in many places on the Internet.
|
||||
Another example specific to the Yocto Project is how to quickly
|
||||
set up your host development system and build an image, which you
|
||||
find in the
|
||||
<ulink url='&YOCTO_DOCS_QS_URL;'>Yocto Project Quick Start</ulink>.
|
||||
<note>
|
||||
By default, using the Yocto Project creates a Poky distribution.
|
||||
However, you can create your own distribution by providing key
|
||||
<link linkend='metadata'>Metadata</link>.
|
||||
A good example is Angstrom, which has had a distribution
|
||||
based on the Yocto Project since its inception.
|
||||
Other examples include commercial distributions like
|
||||
Wind River Linux, Mentor Embedded Linux, and ENEA Linux.
|
||||
See the "<link linkend='creating-your-own-distribution'>Creating Your Own Distribution</link>"
|
||||
section for more information.
|
||||
</note>
|
||||
This manual gives you an idea of how to use the Yocto Project to develop embedded Linux
|
||||
images and user-space applications to run on targeted devices.
|
||||
Reading this manual gives you an overview of image, kernel, and user-space application development
|
||||
using the Yocto Project.
|
||||
Because much of the information in this manual is general, it contains many references to other
|
||||
sources where you can find more detail.
|
||||
For example, detailed information on Git, repositories and open source in general
|
||||
can be found in many places.
|
||||
Another example is how to get set up to use the Yocto Project, which our Yocto Project
|
||||
Quick Start covers.
|
||||
</para>
|
||||
|
||||
<para>
|
||||
The Yocto Project Development Manual does, however, provide
|
||||
guidance and examples on how to change the kernel source code,
|
||||
reconfigure the kernel, and develop an application using the
|
||||
popular <trademark class='trade'>Eclipse</trademark> IDE.
|
||||
<para>
|
||||
The Yocto Project Development Manual, however, does provide detailed examples
|
||||
on how to change the kernel source code, reconfigure the kernel, and develop
|
||||
an application using the popular <trademark class='trade'>Eclipse</trademark> IDE.
|
||||
</para>
|
||||
</section>
|
||||
|
||||
<section id='what-this-manual-provides'>
|
||||
<title>What This Manual Provides</title>
|
||||
<title>What this Manual Provides</title>
|
||||
|
||||
<para>
|
||||
The following list describes what you can get from this guide:
|
||||
<itemizedlist>
|
||||
<listitem><para>Information that lets you get set
|
||||
<listitem><para>Information that lets you get set
|
||||
up to develop using the Yocto Project.</para></listitem>
|
||||
<listitem><para>Information to help developers who are new to
|
||||
the open source environment and to the distributed revision
|
||||
control system Git, which the Yocto Project uses.
|
||||
</para></listitem>
|
||||
<listitem><para>An understanding of common end-to-end
|
||||
development models and tasks.</para></listitem>
|
||||
<listitem><para>Information about common development tasks
|
||||
generally used during image development for
|
||||
embedded devices.
|
||||
</para></listitem>
|
||||
<listitem><para>Many references to other sources of related
|
||||
information.</para></listitem>
|
||||
<listitem><para>Information to help developers who are new to the open source environment
|
||||
and to the distributed revision control system Git, which the Yocto Project
|
||||
uses.</para></listitem>
|
||||
<listitem><para>An understanding of common end-to-end development models and tasks.</para></listitem>
|
||||
<listitem><para>Development case overviews for both system development and user-space
|
||||
applications.</para></listitem>
|
||||
<listitem><para>An overview and understanding of the emulation environment used with
|
||||
the Yocto Project - the Quick EMUlator (QEMU).</para></listitem>
|
||||
<listitem><para>An understanding of basic kernel architecture and concepts.</para></listitem>
|
||||
<listitem><para>Many references to other sources of related information.</para></listitem>
|
||||
</itemizedlist>
|
||||
</para>
|
||||
</section>
|
||||
@@ -75,18 +57,18 @@
|
||||
<para>
|
||||
This manual will not give you the following:
|
||||
<itemizedlist>
|
||||
<listitem><para>Step-by-step instructions if those instructions exist in other Yocto
|
||||
Project documentation.
|
||||
For example, the Yocto Project Application Developer's Guide contains detailed
|
||||
instruction on how to run the
|
||||
<listitem><para>Step-by-step instructions if those instructions exist in other Yocto
|
||||
Project documentation.
|
||||
For example, the Yocto Project Application Developer's Guide contains detailed
|
||||
instruction on how to run the
|
||||
<ulink url='&YOCTO_DOCS_ADT_URL;#installing-the-adt'>Installing the ADT and Toolchains</ulink>,
|
||||
which is used to set up a cross-development environment.</para></listitem>
|
||||
<listitem><para>Reference material.
|
||||
This type of material resides in an appropriate reference manual.
|
||||
For example, system variables are documented in the
|
||||
<listitem><para>Reference material.
|
||||
This type of material resides in an appropriate reference manual.
|
||||
For example, system variables are documented in the
|
||||
<ulink url='&YOCTO_DOCS_REF_URL;'>Yocto Project Reference Manual</ulink>.</para></listitem>
|
||||
<listitem><para>Detailed public information that is not specific to the Yocto Project.
|
||||
For example, exhaustive information on how to use Git is covered better through the
|
||||
<listitem><para>Detailed public information that is not specific to the Yocto Project.
|
||||
For example, exhaustive information on how to use Git is covered better through the
|
||||
Internet than in this manual.</para></listitem>
|
||||
</itemizedlist>
|
||||
</para>
|
||||
@@ -94,113 +76,111 @@
|
||||
|
||||
<section id='other-information'>
|
||||
<title>Other Information</title>
|
||||
|
||||
|
||||
<para>
|
||||
Because this manual presents overview information for many different topics, you will
need to supplement it with other information.
The following list presents other sources of information you might find helpful:
<itemizedlist>
<listitem><para><emphasis><ulink url='&YOCTO_HOME_URL;'>Yocto Project Website</ulink>:
</emphasis> The home page for the Yocto Project provides lots of information on the project
as well as links to software and documentation.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_DOCS_QS_URL;'>Yocto Project Quick Start</ulink>:</emphasis> This short document lets you get started
with the Yocto Project quickly and start building an image.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_DOCS_REF_URL;'>Yocto Project Reference Manual</ulink>:</emphasis> This manual is a reference
guide to the OpenEmbedded build system known as "Poky."
</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_DOCS_ADT_URL;'>Yocto Project Application Developer's Guide</ulink>:</emphasis>
This guide provides information that lets you get going with the Application
Development Toolkit (ADT) and stand-alone cross-development toolchains to
develop projects using the Yocto Project.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_DOCS_BSP_URL;'>Yocto Project Board Support Package (BSP) Developer's Guide</ulink>:</emphasis>
This guide defines the structure for BSP components.
Having a commonly understood structure encourages standardization.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_DOCS_KERNEL_DEV_URL;'>Yocto Project Linux Kernel Development Manual</ulink>:</emphasis>
This manual describes how to work with Linux Yocto kernels as well as provides a bit
of conceptual information on the construction of the Yocto Linux kernel tree.
</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_DOCS_PROF_URL;'>Yocto Project Profiling and Tracing Manual</ulink>:</emphasis>
This manual presents a set of common and generally useful tracing and
profiling schemes along with their applications (as appropriate) to each tool.
</para></listitem>
<listitem><para><emphasis>
<ulink url='http://www.youtube.com/watch?v=3ZlOu-gLsh0'>
Eclipse IDE Yocto Plug-in</ulink>:</emphasis> A step-by-step instructional video that
demonstrates how an application developer uses Yocto Plug-in features within
the Eclipse IDE.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_WIKI_URL;/wiki/FAQ'>FAQ</ulink>:</emphasis>
A list of commonly asked questions and their answers.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_RELEASE_NOTES;'>Release Notes</ulink>:</emphasis>
Features, updates and known issues for the current
release of the Yocto Project.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_HOME_URL;/tools-resources/projects/hob'>
Hob</ulink>:</emphasis> A graphical user interface for BitBake.
Hob's primary goal is to enable a user to perform common tasks more easily.</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_HOME_URL;/download/build-appliance-0'>
Build Appliance</ulink>:</emphasis> A virtual machine that
enables you to build and boot a custom embedded Linux image
with the Yocto Project using a non-Linux development system.
For more information, see the
<ulink url='&YOCTO_HOME_URL;/documentation/build-appliance-manual'>Build Appliance</ulink>
page.
</para></listitem>
<listitem><para><emphasis>
<ulink url='&YOCTO_BUGZILLA_URL;'>Bugzilla</ulink>:</emphasis>
The bug tracking application the Yocto Project uses.
If you find problems with the Yocto Project, you should report them using this
application.</para></listitem>
<listitem><para><emphasis>
Yocto Project Mailing Lists:</emphasis> To subscribe to the Yocto Project mailing
lists, click on the following URLs and follow the instructions:
<itemizedlist>
<listitem><para><ulink url='&YOCTO_LISTS_URL;/listinfo/yocto'></ulink> for a
Yocto Project Discussions mailing list.</para></listitem>
<listitem><para><ulink url='&YOCTO_LISTS_URL;/listinfo/poky'></ulink> for a
Yocto Project Discussions mailing list about the
OpenEmbedded build system (Poky).
</para></listitem>
<listitem><para><ulink url='&YOCTO_LISTS_URL;/listinfo/yocto-announce'></ulink>
for a mailing list to receive official Yocto Project announcements for developments
as well as Yocto Project milestones.</para></listitem>
<listitem><para><ulink url='&YOCTO_LISTS_URL;/listinfo'></ulink> for a
listing of all public mailing lists on <filename>lists.yoctoproject.org</filename>.
</para></listitem>
</itemizedlist></para></listitem>
<listitem><para><emphasis>Internet Relay Chat (IRC):</emphasis>
Two IRC channels on freenode are available
for Yocto Project and Poky discussions: <filename>#yocto</filename> and
<filename>#poky</filename>, respectively.</para></listitem>
<listitem><para><emphasis>
<ulink url='&OH_HOME_URL;'>OpenedHand</ulink>:</emphasis>
The company that initially developed the Poky project, which is the basis
for the OpenEmbedded build system used by the Yocto Project.
OpenedHand was acquired by Intel Corporation in 2008.</para></listitem>
<listitem><para><emphasis>
<ulink url='http://www.intel.com/'>Intel Corporation</ulink>:</emphasis>
A multinational semiconductor chip manufacturer whose Software and
Services Group created and supports the Yocto Project.
Intel acquired OpenedHand in 2008.</para></listitem>
<listitem><para><emphasis>
<ulink url='&OE_HOME_URL;'>OpenEmbedded</ulink>:</emphasis>
The build system used by the Yocto Project.
This project is the upstream, generic, embedded distribution from which the Yocto
Project derives its build system (Poky) and to which it contributes.</para></listitem>
<listitem><para><emphasis>
<ulink url='http://developer.berlios.de/projects/bitbake/'>
BitBake</ulink>:</emphasis> The tool used by the OpenEmbedded build system
to process project metadata.</para></listitem>
<listitem><para><emphasis>
BitBake User Manual:</emphasis>
A comprehensive guide to the BitBake tool.
If you want information on BitBake, see the user manual included in the
<filename>bitbake/doc/manual</filename> directory of the
<link linkend='source-directory'>Source Directory</link>.</para></listitem>
<listitem><para><emphasis>
<ulink url='http://wiki.qemu.org/Index.html'>Quick EMUlator (QEMU)</ulink>:
</emphasis> An open-source machine emulator and virtualizer.</para></listitem>
</itemizedlist>
</para>
</section>
</section>
</chapter>
<!--
vim: expandtab tw=80 ts=4

@@ -7,16 +7,16 @@
<title>Kernel Modification Example</title>

<para>
Kernel modification involves changing or adding configurations to an existing kernel,
changing or adding recipes to the kernel that are needed to support specific hardware features,
or even altering the source code itself.
This appendix presents simple examples that modify the kernel source code,
change the kernel configuration, and add a kernel source recipe.
<note>
You can use the <filename>yocto-kernel</filename> script
found in the <link linkend='source-directory'>Source Directory</link>
under <filename>scripts</filename> to manage kernel patches and configuration.
See the "<ulink url='&YOCTO_DOCS_BSP_URL;#managing-kernel-patches-and-config-items-with-yocto-kernel'>Managing kernel Patches and Config Items with yocto-kernel</ulink>"
section in the Yocto Project Board Support Packages (BSP) Developer's Guide for
more information.</note>
</para>
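<para>
As a quick illustration of that script, the following session is a sketch only;
the exact sub-commands and the "mybsp" BSP name are assumptions drawn from the
BSP Developer's Guide section referenced above rather than part of this example:
<literallayout class='monospaced'>
$ cd ~/poky/scripts
$ ./yocto-kernel config list mybsp
$ ./yocto-kernel config add mybsp CONFIG_PRINTK_TIME=y
$ ./yocto-kernel patch add mybsp ~/some-fix.patch
</literallayout>
</para>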
@@ -25,87 +25,87 @@
<title>Modifying the Kernel Source Code</title>

<para>
This example adds some simple QEMU emulator console output at boot time by
adding <filename>printk</filename> statements to the kernel's
<filename>calibrate.c</filename> source code file.
Booting the modified image causes the added messages to appear on the emulator's
console.
</para>

<section id='understanding-the-files-you-need'>
<title>Understanding the Files You Need</title>

<para>
Before you modify the kernel, you need to know what Git repositories and file
structures you need.
Briefly, you need the following:
<itemizedlist>
<listitem><para>A local
<link linkend='source-directory'>Source Directory</link> for the
poky Git repository</para></listitem>
<listitem><para>Local copies of the
<link linkend='poky-extras-repo'><filename>poky-extras</filename></link>
Git repository placed within the Source Directory.</para></listitem>
<listitem><para>A bare clone of the
<link linkend='local-kernel-files'>Yocto Project Kernel</link> upstream Git
repository to which you want to push your modifications.
</para></listitem>
<listitem><para>A copy of that bare clone in which you make your source
modifications</para></listitem>
</itemizedlist>
</para>

<para>
The following figure summarizes these four areas.
Within each rectangle that represents a data structure, a
host development directory pathname appears at the
lower left-hand corner of the box.
These pathnames are the locations used in this example.
The figure also provides key statements and commands used during the kernel
modification process:
</para>

<para>
<imagedata fileref="figures/kernel-example-repos-generic.png" width="7in" depth="5in"
align="center" scale="100" />
</para>

<para>
Here is a brief description of the four areas:
<itemizedlist>
<listitem><para><emphasis>Local Source Directory:</emphasis>
This area contains all the metadata that supports building images
using the OpenEmbedded build system.
In this example, the
<link linkend='source-directory'>Source Directory</link> also
contains the
<link linkend='build-directory'>Build Directory</link>,
which contains the configuration directory
that lets you control the build.
Also in this example, the Source Directory contains local copies of the
<filename>poky-extras</filename> Git repository.</para>
<para>See the bulleted item
"<link linkend='local-yp-release'>Yocto Project Release</link>"
for information on how to get these files on your local system.</para></listitem>
<listitem><para><emphasis>Local copies of the <filename>poky-extras</filename> Git Repository:</emphasis>
This area contains the <filename>meta-kernel-dev</filename> layer,
which is where you make changes that append the kernel build recipes.
You edit <filename>.bbappend</filename> files to locate your
local kernel source files and to identify the kernel being built.
This Git repository is a gathering place for extensions to the Yocto Project
(or really any) kernel recipes that facilitate the creation and development
of kernel features, BSPs or configurations.</para>
<para>See the bulleted item
"<link linkend='poky-extras-repo'>The
<filename>poky-extras</filename> Git Repository</link>"
for information on how to get these files.</para></listitem>
<listitem><para><emphasis>Bare Clone of the Yocto Project kernel:</emphasis>
This bare Git repository tracks the upstream Git repository of the Linux
Yocto kernel source code you are changing.
When you modify the kernel you must work through a bare clone.
All source code changes you make to the kernel must be committed and
pushed to the bare clone using Git commands.
As mentioned, the <filename>.bbappend</filename> file in the
<filename>poky-extras</filename> repository points to the bare clone
so that the build process can locate the locally changed source files.</para>
<para>See the bulleted item
@@ -113,16 +113,16 @@
for information on how to set up the bare clone.
</para></listitem>
<listitem><para><emphasis>Copy of the Yocto Project Kernel Bare Clone:</emphasis>
This Git repository contains the actual source files that you modify.
Any changes you make to files in this location need to ultimately be pushed
to the bare clone using the <filename>git push</filename> command.</para>
<para>See the bulleted item
"<link linkend='local-kernel-files'>Yocto Project Kernel</link>"
for information on how to set up the bare clone.
<note>Typically, Git workflows follow a scheme where changes made to a local area
are pulled into a Git repository.
However, because the <filename>git pull</filename> command does not work
with bare clones, this workflow pushes changes to the
repository even though you could use other more complicated methods to
get changes into the bare clone.</note>
</para></listitem>
@@ -134,11 +134,11 @@
<title>Setting Up the Local Source Directory</title>

<para>
You can set up the
<link linkend='source-directory'>Source Directory</link>
through tarball extraction or by
cloning the <filename>poky</filename> Git repository.
This example uses <filename>poky</filename> as the root directory of the
local Source Directory.
See the bulleted item
"<link linkend='local-yp-release'>Yocto Project Release</link>"
@@ -146,17 +146,17 @@
</para>

<para>
Once you have the Source Directory set up,
you have many development branches from which you can work.
From inside the local repository you can see the branch names and the tag names used
in the upstream Git repository by using either of the following commands:
<literallayout class='monospaced'>
$ cd poky
$ git branch -a
$ git tag -l
</literallayout>
This example uses the Yocto Project &DISTRO; Release code named "&DISTRO_NAME;",
which maps to the <filename>&DISTRO_NAME;</filename> branch in the repository.
The following commands create and check out the local <filename>&DISTRO_NAME;</filename>
branch:
<literallayout class='monospaced'>
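# Sketch only: the checkout command itself falls outside this excerpt, but it
# presumably follows the same pattern as the other branch checkouts in this
# appendix, for example:
$ git checkout -b &DISTRO_NAME; origin/&DISTRO_NAME;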
@@ -171,20 +171,20 @@
<title>Setting Up the Local poky-extras Git Repository</title>

<para>
This example creates a local copy of the <filename>poky-extras</filename> Git
repository inside the <filename>poky</filename> Source Directory.
See the bulleted item "<link linkend='poky-extras-repo'>The
<filename>poky-extras</filename> Git Repository</link>"
for information on how to set up a local copy of the
<filename>poky-extras</filename> repository.
</para>

<para>
Because this example uses the Yocto Project &DISTRO; Release code
named "&DISTRO_NAME;", which maps to the <filename>&DISTRO_NAME;</filename>
branch in the repository, you need to be sure you are using that
branch for <filename>poky-extras</filename>.
The following commands create and check out the local
branch you are using for the <filename>&DISTRO_NAME;</filename>
branch:
<literallayout class='monospaced'>
@@ -201,25 +201,25 @@

<para>
This example modifies the <filename>linux-yocto-3.4</filename> kernel.
Thus, you need to create a bare clone of that kernel and then make a copy of the
bare clone.
See the bulleted item
"<link linkend='local-kernel-files'>Yocto Project Kernel</link>"
for information on how to do that.
</para>

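<para>
For orientation only, the bare clone and its working copy can be created with
commands along the following lines; the repository URL is an assumption here,
so rely on the bulleted item referenced above for the authoritative steps:
<literallayout class='monospaced'>
$ cd ~
$ git clone --bare git://git.yoctoproject.org/linux-yocto-3.4 linux-yocto-3.4.git
$ git clone linux-yocto-3.4.git my-linux-yocto-3.4-work
</literallayout>
</para>
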
<para>
The bare clone exists for the kernel build tools and simply serves as the receiving end
of <filename>git push</filename>
commands after you make edits and commits inside the copy of the clone.
The copy (<filename>my-linux-yocto-3.4-work</filename> in this example) has to have
a local branch created and checked out for your work.
This example uses <filename>common-pc-base</filename> as the local branch.
The following commands create and check out the branch:
<literallayout class='monospaced'>
$ cd ~/my-linux-yocto-3.4-work
$ git checkout -b standard-common-pc-base origin/standard/common-pc/base
Branch standard-common-pc-base set up to track remote branch
standard/common-pc/base from origin.
Switched to a new branch 'standard-common-pc-base'
</literallayout>
@@ -230,22 +230,22 @@
<title>Building and Booting the Default QEMU Kernel Image</title>

<para>
Before we make changes to the kernel source files, this example first builds the
default image and then boots it inside the QEMU emulator.
<note>
Because a full build can take hours, you should check two variables in the
<filename>build</filename> directory that is created after you source the
<filename>&OE_INIT_FILE;</filename> script.
You can find these variables
<filename>BB_NUMBER_THREADS</filename> and <filename>PARALLEL_MAKE</filename>
in the <filename>build/conf</filename> directory in the
<filename>local.conf</filename> configuration file.
By default, these variables are commented out.
If your host development system supports multi-core and multi-thread capabilities,
you can uncomment these statements and set the variables to significantly shorten
the full build time.
As a guideline, set both <filename>BB_NUMBER_THREADS</filename> and
<filename>PARALLEL_MAKE</filename> to twice the number
of cores your machine supports.
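For example, on a host with four cores, the uncommented lines in
<filename>local.conf</filename> might look like the following; the values shown
are illustrative, not part of the original example:
<literallayout class='monospaced'>
BB_NUMBER_THREADS = "8"
PARALLEL_MAKE = "-j 8"
</literallayout>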
</note>
The following two commands <filename>source</filename> the build environment setup script
@@ -255,9 +255,9 @@
$ cd ~/poky
$ source &OE_INIT_FILE;
You had no conf/local.conf file. This configuration file has therefore been
created for you with some default values. You may wish to edit it to use a
different MACHINE (target hardware) or enable parallel build options to take
advantage of multiple cores for example. See the file for more information as
common configuration options are commented.

The Yocto Project has extensive documentation about OE including a reference manual
@@ -305,7 +305,7 @@
before starting the build.</note>
</para>

<para>
After the build completes, you can start the QEMU emulator using the resulting image
<filename>qemux86</filename> as follows:
<literallayout class='monospaced'>
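# Not shown in this excerpt; it presumably matches the boot command used later
# in this appendix for the modified image:
$ runqemu qemux86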
@@ -317,9 +317,9 @@
As the image boots in the emulator, console messages and status output appear
across the terminal window.
Because the output scrolls by quickly, it is difficult to read.
To examine the output, you log into the system using the
login <filename>root</filename> with no password.
Once you are logged in, issue the following command to scroll through the
console output:
<literallayout class='monospaced'>
# dmesg | less
@@ -360,7 +360,7 @@
</para>

<para>
Here is the altered code showing five new <filename>printk</filename> statements
near the top of the function:
<literallayout class='monospaced'>
void __cpuinit calibrate_delay(void)
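        /*
         * The five added lines fall outside this excerpt.  They are ordinary
         * printk() calls; a hypothetical reconstruction might look like:
         *
         *     printk("*************************************\n");
         *     printk("*                                   *\n");
         *     printk("*        HELLO YOCTO KERNEL         *\n");
         *     printk("*                                   *\n");
         *     printk("*************************************\n");
         */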
@@ -392,9 +392,9 @@
</para>

<para>
Once the source code has been modified, you need to use Git to push the changes to
the bare clone.
If you do not push the changes, then the OpenEmbedded build system will not pick
up the changed source files.
</para>

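<para>
The commit and push commands themselves are not part of this excerpt.  A
hypothetical sequence, reusing the branch names from earlier in this example,
would be the following; the <filename>init/calibrate.c</filename> path and the
commit message are illustrative assumptions:
<literallayout class='monospaced'>
$ cd ~/my-linux-yocto-3.4-work
$ git add init/calibrate.c
$ git commit -m "calibrate.c: add boot-time printk messages"
$ git push origin standard-common-pc-base:standard/common-pc/base
</literallayout>
</para>
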
@@ -411,43 +411,43 @@

<para>
At this point, the source has been changed and pushed.
The example now defines some variables used by the OpenEmbedded build system
to locate your kernel source.
You essentially need to identify where to find the kernel recipe and the changed source code.
You also need to be sure some basic configurations are in place that identify the
type of machine you are building and to help speed up the build should your host support
multiple-core and thread capabilities.
</para>

<para>
Do the following to make sure the build parameters are set up for the example.
Once you set up these build parameters, they do not have to change unless you
change the target architecture of the machine you are building or you move
the bare clone, copy of the clone, or the <filename>poky-extras</filename> repository:
<itemizedlist>
<listitem><para><emphasis>Build for the Correct Target Architecture:</emphasis> The
<filename>local.conf</filename> file in the build directory defines the build's
target architecture.
By default, <filename>MACHINE</filename> is set to
<filename>qemux86</filename>, which specifies a 32-bit
<trademark class='registered'>Intel</trademark> Architecture
target machine suitable for the QEMU emulator.
In this example, <filename>MACHINE</filename> is correctly configured.
</para></listitem>
<listitem><para><emphasis>Optimize Build Time:</emphasis> Also in the
<filename>local.conf</filename> file are two variables that can speed your
build time if your host supports multi-core and multi-thread capabilities:
<filename>BB_NUMBER_THREADS</filename> and <filename>PARALLEL_MAKE</filename>.
If the host system has multiple cores then you can optimize build time
by setting both these variables to twice the number of
cores.</para></listitem>
<listitem><para><emphasis>Identify Your <filename>meta-kernel-dev</filename>
Layer:</emphasis> The <filename>BBLAYERS</filename> variable in the
<filename>bblayers.conf</filename> file found in the
<filename>poky/build/conf</filename> directory needs to have the path to your local
<filename>meta-kernel-dev</filename> layer.
By default, the <filename>BBLAYERS</filename> variable contains paths to
<filename>meta</filename> and <filename>meta-yocto</filename> in the
<filename>poky</filename> Git repository.
Add the path to your <filename>meta-kernel-dev</filename> location.
Be sure to substitute your user information in the statement.
@@ -460,14 +460,14 @@
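# The first lines of the statement fall outside this excerpt.  As described
# above, the default value lists the meta and meta-yocto layers; a sketch of
# the full statement (reusing this example's /home/scottrif paths) would be:
BBLAYERS ?= " \
  /home/scottrif/poky/meta \
  /home/scottrif/poky/meta-yocto \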
/home/scottrif/poky/poky-extras/meta-kernel-dev \
"
</literallayout></para></listitem>
<listitem><para><emphasis>Identify Your Source Files:</emphasis> In the
<filename>linux-yocto_3.4.bbappend</filename> file located in the
<filename>poky-extras/meta-kernel-dev/recipes-kernel/linux</filename>
directory, you need to identify the location of the
local source code, which in this example is the bare clone named
<filename>linux-yocto-3.4.git</filename>.
To do this, set the <filename>KSRC_linux_yocto</filename> variable to point to your
local <filename>linux-yocto-3.4.git</filename> Git repository by adding the
following statement.
Also, be sure the <filename>SRC_URI</filename> variable is pointing to
your kernel source files by removing the comment.
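<para>
The statement itself is not shown in this excerpt.  Under the assumptions of this
example (a bare clone in the scottrif home directory), it would take roughly the
following form; treat the exact path as a placeholder for your own location:
<literallayout class='monospaced'>
KSRC_linux_yocto = "/home/scottrif/linux-yocto-3.4.git"
</literallayout>
</para>
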
@@ -480,20 +480,20 @@
</para>

<note>
<para>Before attempting to build the modified kernel, there is one more set of changes you
need to make in the <filename>meta-kernel-dev</filename> layer.
Because all the kernel <filename>.bbappend</filename> files are parsed during the
build process regardless of whether you are using them or not, you should either
comment out the <filename>COMPATIBLE_MACHINE</filename> statements in all
unused <filename>.bbappend</filename> files, or simply remove (or rename) all the files
except the one you are using for the build
(i.e. <filename>linux-yocto_3.4.bbappend</filename> in this example).</para>
<para>If you do not make one of these two adjustments, your machine will be compatible
with all the kernel recipes in the <filename>meta-kernel-dev</filename> layer.
When your machine is compatible with all the kernel recipes, the build attempts
to build all kernels in the layer.
You could end up with build errors blocking your work.</para>
</note>
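<para>
As a sketch of the second adjustment, renaming the unused append files could look
like the following; the file name shown is a hypothetical example of another
kernel append in the layer, not a file this example requires:
<literallayout class='monospaced'>
$ cd ~/poky/poky-extras/meta-kernel-dev/recipes-kernel/linux
$ mv linux-yocto-dev.bbappend linux-yocto-dev.bbappend.orig
</literallayout>
</para>
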
</section>

<section id='building-and-booting-the-modified-qemu-kernel-image'>
@@ -511,7 +511,7 @@
$ source &OE_INIT_FILE;
</literallayout>
</para></listitem>
<listitem><para>Be sure old images are cleaned out by running the
<filename>cleanall</filename> BitBake task as follows from your build directory:
<literallayout class='monospaced'>
$ bitbake -c cleanall linux-yocto
@@ -524,7 +524,7 @@
<literallayout class='monospaced'>
$ bitbake -k core-image-minimal
</literallayout></para></listitem>
<listitem><para>Finally, boot the modified image in the QEMU emulator
using this command:
<literallayout class='monospaced'>
$ runqemu qemux86
@@ -533,7 +533,7 @@
</para>

<para>
Log into the machine using <filename>root</filename> with no password and then
use the following shell command to scroll through the console's boot output.
<literallayout class='monospaced'>
# dmesg | less
@@ -541,7 +541,7 @@
</para>

<para>
You should see the results of your <filename>printk</filename> statements
as part of the output.
</para>
</section>

File diff suppressed because it is too large.
Some files were not shown because too many files have changed in this diff.