[tbot] [DENX] tbot: board hangs if no autoload

Heiko Schocher hs at denx.de
Thu Nov 15 10:44:57 UTC 2018

Hello Stefano,

Am 15.11.2018 um 11:23 schrieb Stefano Babic:
> Hi Heiko, Harald,
> On 15/11/18 10:49, Heiko Schocher wrote:
>> Hello Harald,
>> Am 15.11.2018 um 10:28 schrieb Harald Seiler:
>>> On Thu, 2018-11-15 at 10:23 +0100, Lukasz Majewski wrote:
>>>> On Thu, 15 Nov 2018 10:19:17 +0100
>>>> Harald Seiler <hws at denx.de> wrote:
>>>>> On Thu, 2018-11-15 at 10:10 +0100, Stefano Babic wrote:
>>>>>> Hi Harald,
>>>>>> On 15/11/18 09:36, Harald Seiler wrote:
>>>>>>> Hi Stefano!
>>>>>>> Yes, TBot waits for an autoboot prompt.  You can disable this by
>>>>>>> setting `autoboot_prompt` in your UBootMachine to the U-Boot
>>>>>>> prompt.
>>>>>>>      class MyUBoot(board.UBootMachine):
>>>>>>>          prompt = "=> "
>>>>>>>          autoboot_prompt = "=> "
>>>>>>> I know this is more of a hack
>>>>>> Yes, because it is not a fixed property of the board. It depends
>>>>>> on whether autoload is active and bootcmd is set on the board.
>>>>>> The "mira" board does not have "bootcmd" set in its default
>>>>>> environment, and the behavior changes right after setting bootcmd.
>>>>> Hmm, good point, I will think about it ...
>>>>>>> and I will add a proper way to do this in
>>>>>>> a future release
>>>>>> Nice !
>>>>>>> (keep an eye on the CHANGELOG, there will be a lot of
>>>>>>> small convenience features like this in the next versions!)
>>>>>> Do we have a repo for testcases in the new format ? I really
>>>>>> appreciate that the new tbot has a clean split between software
>>>>>> (tbot), setup (boards and lab) and testcases. We have repos for
>>>>>> the first two, but I cannot find a set of common testcases, i.e.
>>>>>> testcases already converted to @tbot.testcase. If there is none,
>>>>>> I would start writing them myself, but I do not want to reinvent
>>>>>> the wheel (and of course, I would make more mistakes..)
>>>>> That is what my `tbot-denx` repo is supposed to be (DENX-Internal
>>>>> only):
>>>>>      https://gitlab.denx.de/HaraldSeiler/tbot-denx
>>>>> However, it doesn't contain any testcases yet.
>>>> Is there a way to convert (or directly re-use) Heiko's test cases?
>>> Unfortunately not, there is absolutely no compatibility between the
>>> two versions ... so it needs a human to do it.
>>> I guess that, at the moment, I am in the best position to do so; I
>>> just need input about which testcases have the highest priority for you.
>> Instead, it may make more sense to first discuss what we want to test
>> and then how to write the testcases?
> Right, let's see.
>> Proposal what I have already:
>> U-Boot
>> - get sources
>> - optionally apply patches to them
>>    from a list of patches in a directory ?
>>    from a patchwork todo list ?
>> - install u-boot on the board
>> - check if really the new version of u-boot boots
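The last item of this list lends itself to a small, generic helper: compare the output of U-Boot's `version` command against the version that was just built. A minimal sketch (the function name and the sample output string are illustrative, not an existing tbot API):

```python
import re

def check_uboot_version(version_output: str, expected: str) -> bool:
    """Return True if U-Boot's `version` output reports the expected version.

    `version_output` is what U-Boot prints for the `version` command, e.g.
    "U-Boot 2018.11-rc3-00042-gdeadbeef (Nov 15 2018 - 10:00:00 +0100)".
    """
    match = re.search(r"U-Boot\s+(\S+)", version_output)
    return match is not None and match.group(1) == expected
```

In a testcase, `version_output` would come from running `version` at the U-Boot prompt after the update.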
> I admit that I am not very interested in how to build and push software
> onto the board. There are a lot of different paths, and this is not what
> the customers ask for. They have their own buildserver (Yocto /
> buildroot / Jenkins / custom / whatever...), and the process of fetching
> and applying patches is more interesting for a U-Boot maintainer than
> for a customer.

Yes ... but why should developers not use tbot, too?
Having one command that does all the boring stuff from scratch is
nice. It is also nice if, at the end, you get documentation of all the
steps, so the customer can reproduce them.

> If we start to convert how to install software on the board, we start
> with a lot of single different cases, because this is absolutely board
> specific.

Yes ... so write a board-specific testcase for the board-specific part,
and call it from a generic one ...

> My vote goes to start with the more general cases, that is: Software is
> on the board, does the board work as expected ? Things like:
> - U-Boot:
> 	- does network work ?
> 	- does storage work ?
> 	- do other u-boot peripherals work ?

Of course also a valid starting point!

But you must also define a way to find out which devices are
on the board... For example, I ran "help date", and if that was
successful, I could test the date command ...

Or parse the help output and decide from that?
Parse U-Boot's config and/or DTS ?
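The "parse the help output" idea could look like this. A sketch assuming the usual one-entry-per-line format of U-Boot's `help` output (the sample text is illustrative):

```python
def parse_uboot_help(help_output: str) -> set:
    """Extract the available command names from U-Boot `help` output.

    A command entry looks like "date      - get/set/reset date & time";
    wrapped description lines start with whitespace and are skipped.
    """
    commands = set()
    for line in help_output.splitlines():
        if line[:1].isspace() or " - " not in line:
            continue
        commands.add(line.split()[0])
    return commands

sample_help = (
    "?         - alias for 'help'\n"
    "bdinfo    - print Board Info structure\n"
    "date      - get/set/reset date & time\n"
)
available = parse_uboot_help(sample_help)
```

A generic testcase could then run the date test only if `"date" in available`, which automates the "help date" check above.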

> Such testcases are unaware of which board is running, so we can have
> more general test cases right from the start. The same goes for Linux,
> though there we can of course have many more.

see above.

Also, as you can call testcases from another testcase, you can write
a board-specific testcase in which you (as the board maintainer) should
know which generic testcases you can call ...

>> - create a register dump file
>>    write register content into a register dump file
>> - do register checks
>>    open register dump file and check if register content
>>    is the same as in the file
>> - convert all DUTS testcases
>>    http://git.denx.de/?p=duts.git
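The register-dump items above could be done very generically with a helper pair, one to write the dump and one to compare against it. A sketch; the file format and function names are my own choice, not an existing tbot API:

```python
def dump_registers(regs: dict) -> str:
    """Serialize {address: value} into dump-file lines like
    '0x020e0000: 0x00000005' (one register per line, sorted by address)."""
    return "\n".join(
        f"0x{addr:08x}: 0x{val:08x}" for addr, val in sorted(regs.items())
    )

def check_registers(dump_text: str, current: dict) -> list:
    """Compare a previously saved dump against the current register values.
    Returns a list of mismatch descriptions; an empty list means all match."""
    mismatches = []
    for line in dump_text.splitlines():
        addr_text, val_text = line.split(":")
        addr, expected = int(addr_text, 16), int(val_text, 16)
        if current.get(addr) != expected:
            mismatches.append(f"0x{addr:08x}: expected 0x{expected:08x}")
    return mismatches
```

In a testcase, `current` would be filled by reading the registers on the target (e.g. with U-Boot's `md` command).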
> I do not think this is a great idea. This "duts" is obsolete, and I
> think we now have a more generic and better concept with tbot. I think
> we should just have a list of test cases and then translate them into
> @tbot.testcase, without looking at the past. IMHO duts is quite broken
> and we should not care about it; it can just confuse us and could be a
> waste of time.

But there are a lot of valid tests!

It is just an idea ... I converted some of them (not all of them
finished) and, based on the results, produced a U-Boot command-line
documentation, as we had with the DULG.

>>    goal: create, at the end, a U-Boot command-line documentation
>> - call pytest from u-boot?
> Do we ?

I meant: call the U-Boot test framework, which lives in "test/py",
from tbot.

>> - if new u-boot does not boot, switch bootmode and unbreak it
> This is also very board specific and it does not always work. I prefer
> to start with a more generic approach.
> For example, start with testing network in U-Boot. How can I split
> between lab setup and board setup ? Let's say the tftp server: I can do
> a "setenv serverip" in the board file, but this is broken, because a
> board could belong to different labs (I have a mira here and I have my
> own lab setup). Is there a way to do this ? Where should I look for
> such cases ?

Then the serverip should be a lab-specific variable.
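The split could look roughly like this (plain Python; the class, names and values are illustrative, real tbot has its own lab/board configuration classes): the lab config owns the serverip, and the board file only consumes it, so the same board definition works unchanged in any lab.

```python
class LabConfig:
    """Lab-specific settings; each lab ships its own version of this.
    (Illustrative only; not tbot's real configuration API.)"""
    serverip = "192.168.1.1"
    tftp_root = "/srv/tftp"

def setup_uboot_network(run_uboot_cmd, lab: LabConfig) -> None:
    """Configure U-Boot's network from the *lab* config, not the board file."""
    run_uboot_cmd(f"setenv serverip {lab.serverip}")

# Demo: collect the commands instead of sending them to a real board.
sent = []
setup_uboot_network(sent.append, LabConfig())
```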

>> Linux:
>> - get sources
>> - optionally apply patches to them
>> - install linux on the board
>> - check if booted version is the expected one
>> - create a register dump file
>>    write register content into a register dump file
>> - do register checks
> See above. I think this is useful during a porting, but it is less
> useful for a customer who wants to test functionalities. I would like to

I have a different opinion here.

This is also interesting for a customer.

Which customer never changes a DTS or never tries a Linux update
on his own?

Having an automated check that all important registers are set up
as expected ... that is nice.

This testcase could be written very generically...

> have first a catalog of testcases with functionalities, like:
> 	- is network working ?
> 	- are peripherals working (SPI / I2C /....) ?

Yes. My hope is that we get a lot of users, so we will get a lot of
testcases ;-)

> In the ideal case, DT is parsed to get a list of testcases...


>>    open register dump file and check if register content
>>    is the same as in the file
>> - look if a list of string are in dmesg output
>> - look for example at the LTP project, what they test
> +1
> LTP contains a lot of useful testcases, but of course they are meant to
> run as scripts directly on the target / host. Anyway, they have
> testcases for a lot of things.

Yes, and we may be able to use these scripts! Start them and analyse
the results.
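Analysing the results could mean counting LTP's standard result tags in the captured output. A sketch assuming LTP's usual TPASS/TFAIL/TBROK/TCONF result lines (the sample output is illustrative):

```python
def summarize_ltp(output: str) -> dict:
    """Count LTP result tags in a test's captured output.
    LTP result lines look like 'abs01       1  TPASS  :  Test passed'."""
    counts = {"TPASS": 0, "TFAIL": 0, "TBROK": 0, "TCONF": 0}
    for line in output.splitlines():
        for tag in counts:
            if tag in line.split():
                counts[tag] += 1
                break
    return counts

sample = (
    "abs01       1  TPASS  :  Test passed\n"
    "abs01       2  TFAIL  :  Test failed\n"
)
summary = summarize_ltp(sample)
```

A testcase would then fail if `summary["TFAIL"]` or `summary["TBROK"]` is non-zero.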

>> - check if ptest-runner is in the rootfs and call it
> ptest-runner means python. Do we have it on most projects? Some yes,
> some not...

Hence the "check if ptest-runner exists" step ;-)
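That check-then-run pattern, sketched with a pluggable `shell_exec` so it is independent of how commands reach the target (the function names are made up; in tbot, `shell_exec` would wrap something like the Linux machine's exec method):

```python
def has_command(shell_exec, name: str) -> bool:
    """Check whether a command exists on the target, via POSIX `command -v`.
    `shell_exec` runs a shell command on the target and returns its exit code."""
    return shell_exec(f"command -v {name} >/dev/null 2>&1") == 0

def maybe_run_ptests(shell_exec) -> bool:
    """Run ptest-runner only if it is present in the rootfs;
    returns whether it was actually run."""
    if not has_command(shell_exec, "ptest-runner"):
        return False
    shell_exec("ptest-runner")
    return True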

>> ...
>> yocto:
>> - get the sources
>> - configure
>> - bake
>> - check if files you are interested in are created
>> - install new images
>> - boot them
>> - check if the rootfs version is correct
> See above - IMHO it is better to split between functional tests on
> target and build, and to start with the functional tests.

Of course. Both parts can be done independently.
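The "check if the rootfs version is correct" step from the yocto list above could be done generically by comparing /etc/os-release on the target with the version that was just built. A sketch; the field choice is an assumption, images may record their version elsewhere:

```python
def check_rootfs_version(os_release_text: str, expected: str) -> bool:
    """Check the deployed image version.  `os_release_text` is the content
    of /etc/os-release as read from the target (e.g. 'cat /etc/os-release');
    the VERSION_ID field is compared against the expected build version."""
    for line in os_release_text.splitlines():
        if line.startswith("VERSION_ID="):
            return line.split("=", 1)[1].strip().strip('"') == expected
    return False
```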

DENX Software Engineering GmbH,      Managing Director: Wolfgang Denk
HRB 165235 Munich, Office: Kirchenstr.5, D-82194 Groebenzell, Germany
Phone: +49-8142-66989-52   Fax: +49-8142-66989-80   Email: hs at denx.de
