Mainline U-Boot with SPI, NVMe and SATA boot support
Have you disabled eMMC by jumpering the SW4 switch? https://wiki.pine64.org/index.php/ROCKPr...sable_eMMC
That is probably irrelevant here, though, since kernel 4.4 works.

You could also try erasing the SPI with the ayufan or sigmaris erase image.
(07-01-2021, 01:06 PM)LMM Wrote: Have you disabled eMMC by jumpering the SW4 switch? https://wiki.pine64.org/index.php/ROCKPr...sable_eMMC
That is probably irrelevant here, though, since kernel 4.4 works.

You could also try erasing the SPI with the ayufan or sigmaris erase image.


I erased the SPI and got the same result trying to boot from eMMC.
As expected, booting from USB doesn't work anymore either.

After reflashing the SPI:
- boot from eMMC fails again
- boot from USB works
(07-03-2021, 04:47 AM)MisterA Wrote: I erased the SPI and got the same result trying to boot from eMMC.
As expected, booting from USB doesn't work anymore either.

After reflashing the SPI:
- boot from eMMC fails again
- boot from USB works

One last thing to try, after which I am stuck (but I am really pessimistic):
- launch the Armbian 4.4 kernel, which is working
- copy U-Boot to the SPI flash:
launch armbian-config
    in the menu: System > Hardware > Install > Install/update the bootloader on SPI flash

In this case the board will use the 4.4 bootloader, since it is written in the SPI flash, and can then boot the 5.x kernel.

Another option, if the problem comes from the bootloader, is to use the ayufan U-Boot.

My board version is v2.1 2018-07-02, which is newer than yours.

You could also try NVMe if you have a drive (USB works, so perhaps NVMe works too). NVMe SSD speed is higher than eMMC speed, but a 256 GB NVMe SSD is around $30-40 and the PCIe adapter card is around $10... and it is not certain it will work! I tried a Samsung EVO and a PNY XLR8; they both work well with the Pine64 PCIe adapter. I am not sure it is worth trying.
I managed to build mainline U-Boot 2021.07 and get it running on the RockPro64. Unfortunately, I'm not smart enough to tie your features (SATA boot, HDMI console support, etc.) into mainline. Any pointers while I try to learn this?
(07-08-2021, 04:55 PM)Mentaluproar Wrote: I managed to build mainline U-Boot 2021.07 and get it running on the RockPro64. Unfortunately, I'm not smart enough to tie your features (SATA boot, HDMI console support, etc.) into mainline. Any pointers while I try to learn this?

I assume you know how to use Docker, at least the basics. For the compilation, I built a Docker image with the following Dockerfile:

Code:
FROM debian:bullseye

ENV DEBIAN_FRONTEND=noninteractive

RUN apt-get clean && apt-get update && \
    apt-get install -y \
    bc build-essential bison device-tree-compiler \
    flex git libssl-dev make libncurses5-dev lzop perl \
    crossbuild-essential-armhf \
    crossbuild-essential-arm64 \
    gcc-arm-none-eabi \
    gcc-aarch64-linux-gnu \
    python3 swig \
    dosfstools mtools parted nano bash-completion udev && \
    git config --global user.name "FIRST_NAME LAST_NAME" && \
    git config --global user.email "email@example.com"

WORKDIR /home/user

CMD tail -f /dev/null

You most likely won't need all of the packages there and can adjust the list to your needs. Build the image from the directory containing the Dockerfile with "docker build -t cross-compile ." so it is tagged "cross-compile".
Run an instance of that image with:

Code:
docker run --device /dev/fuse --cap-add SYS_ADMIN --name cross-compile -it cross-compile bash

The build script provided by @sigmaris is based on Azure Pipelines. Based on that, I created my own bash script that does the job:

Code:
#!/bin/bash

# clone arm-trusted-firmware from ARM
git clone https://github.com/ARM-software/arm-trusted-firmware.git
cd arm-trusted-firmware
git fetch --all --tags
git checkout tags/v2.6-rc0

# Clean and Build arm-trusted-firmware bl31.elf
make realclean
make -j$(getconf _NPROCESSORS_ONLN) CROSS_COMPILE=aarch64-linux-gnu- PLAT=rk3399 bl31

# export environment variable BL31 containing the path to the built bl31.elf
export BL31=$(realpath build/rk3399/release/bl31/bl31.elf)

# change dir out of arm-trusted-firmware directory
cd ..

# clone u-boot branch and change dir into u-boot
#git clone --single-branch --branch ci-2020.07-rockpro64-v3 https://github.com/sigmaris/u-boot.git
#cd u-boot

git clone --single-branch --branch v2021.10 https://github.com/u-boot/u-boot.git
cd u-boot
git checkout v2021.10
git remote add u-boot-sigmaris https://github.com/sigmaris/u-boot.git
git fetch u-boot-sigmaris --tags
git merge --allow-unrelated-histories --no-edit u-boot-sigmaris/ci-2021.07-rockpro64-atf-master
git remote remove u-boot-sigmaris

# Clean and Configure U-Boot with rockpro64 default config
make mrproper
make rockpro64-rk3399_defconfig

# Build U-Boot
make -j$(getconf _NPROCESSORS_ONLN) CROSS_COMPILE=aarch64-linux-gnu-

# Make idbloader.img for MMC/SD
tools/mkimage -n rk3399 -T rksd -d tpl/u-boot-tpl.bin:spl/u-boot-spl.bin mmc_idbloader.img

# Make idbloader.img for SPI
tools/mkimage -n rk3399 -T rkspi -d tpl/u-boot-tpl.bin:spl/u-boot-spl.bin spi_idbloader.img

# Copy common env object and extract env data

#cp env/built-in.o built_in_env.o
#aarch64-linux-gnu-objcopy -O binary -j ".rodata.default_environment" built_in_env.o  
cp env/common.o common.o
aarch64-linux-gnu-objcopy --dump-section .rodata.default_environment=common.o env/common.o

# Replace null terminator in built-in env with newlines

#tr '\0' '\n' < built_in_env.o | sed '/^$/d' > built_in_env.txt
tr '\0' '\n' < common.o | sed '/^$/d' > built_in_env.txt

# Make common env image with correct CRC for MMC/SD
tools/mkenvimage -s 0x8000 -o mmc_default_env.img built_in_env.txt

# Make common env image with correct CRC for SPI
tools/mkenvimage -s 0x8000 -o spi_default_env.img built_in_env.txt

# Create staging directory
mkdir ../build

# Copy artifacts to staging directory

cp u-boot.itb ../build/mmc_u-boot.itb
cp mmc_idbloader.img ../build
cp mmc_default_env.img ../build

cp u-boot.itb ../build/spi_u-boot.itb
cp spi_idbloader.img ../build
cp spi_default_env.img ../build

# Build u-boot binaries
padsize=$((0x60000 - 1))

img1size=$(wc -c < "spi_idbloader.img")
[ $img1size -le $padsize ] || exit 1

dd if=/dev/zero of=spi_idbloader.img conv=notrunc bs=1 count=1 seek=$padsize

#dd if=/dev/zero of=mmc_idbloader.img conv=notrunc bs=1 count=1 seek=$padsize
#cat mmc_idbloader.img u-boot.itb > ../build/spi_combined.img

cat spi_idbloader.img u-boot.itb > ../build/spi_combined.img

tools/mkimage -C none -A arm -T script -d scripts/flash_spi.cmd ../build/flash_spi.scr
tools/mkimage -C none -A arm -T script -d scripts/erase_spi.cmd ../build/erase_spi.scr

# SPI flash and erase images

# Flash image

dd if=/dev/zero of=boot.tmp bs=1M count=16

mkfs.vfat -n SCRIPT boot.tmp

mcopy -sm -i boot.tmp ../build/flash_spi.scr ::
mcopy -sm -i boot.tmp ../build/spi_combined.img ::

dd if=/dev/zero of=../build/flash_spi.img bs=1M count=32

parted -s ../build/flash_spi.img mklabel gpt
parted -s ../build/flash_spi.img unit s mkpart loader1 64 8063
parted -s ../build/flash_spi.img unit s mkpart loader2 16384 24575
parted -s ../build/flash_spi.img unit s mkpart boot fat16 24576 100%
parted -s ../build/flash_spi.img set 3 legacy_boot on

dd if=../build/mmc_idbloader.img of=../build/flash_spi.img conv=notrunc seek=64
dd if=../build/mmc_u-boot.itb of=../build/flash_spi.img conv=notrunc seek=16384
dd if=boot.tmp of=../build/flash_spi.img conv=notrunc seek=24576

rm boot.tmp

# Erase image

dd if=/dev/zero of=boot.tmp bs=1M count=16
mkfs.vfat -n SCRIPT boot.tmp

mcopy -sm -i boot.tmp ../build/erase_spi.scr ::
mcopy -sm -i boot.tmp ../build/spi_combined.img ::

dd if=/dev/zero of=../build/erase_spi.img bs=1M count=32

parted -s ../build/erase_spi.img mklabel gpt
parted -s ../build/erase_spi.img unit s mkpart loader1 64 8063
parted -s ../build/erase_spi.img unit s mkpart loader2 16384 24575
parted -s ../build/erase_spi.img unit s mkpart boot fat16 24576 100%
parted -s ../build/erase_spi.img set 3 legacy_boot on

dd if=../build/mmc_idbloader.img of=../build/erase_spi.img conv=notrunc seek=64
dd if=../build/mmc_u-boot.itb of=../build/erase_spi.img conv=notrunc seek=16384
dd if=boot.tmp of=../build/erase_spi.img conv=notrunc seek=24576

rm boot.tmp

What does the script actually do?

We pull and build the ARM Trusted Firmware (here: v2.6-rc0; you can change it to v2.5 if you want to). Other than the version, it is the same as in the Azure pipeline.

It then pulls mainline U-Boot and checks out v2021.10. We merge in sigmaris' changes (this will only work if there are no conflicts), and finally build U-Boot.
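One detail worth calling out: before concatenating, the script grows spi_idbloader.img to exactly 0x60000 bytes with a single seek-write of one zero byte, presumably because U-Boot proper (u-boot.itb) is expected at that fixed offset in SPI flash. Here is a minimal standalone demo of that dd padding trick on throwaway files (the demo_* names are mine, not from the script):

Code:
```shell
#!/bin/sh
# Demo of the padding step in the build script: dd writes a single zero byte
# at offset (0x60000 - 1), which grows the file to exactly 0x60000 bytes
# (zero-filled), so the payload appended afterwards starts at offset 0x60000.
set -e
padsize=$((0x60000 - 1))
printf 'IDB' > demo_idbloader.img   # stand-in for spi_idbloader.img
printf 'ITB' > demo_u-boot.itb      # stand-in for u-boot.itb
dd if=/dev/zero of=demo_idbloader.img conv=notrunc bs=1 count=1 seek=$padsize 2>/dev/null
cat demo_idbloader.img demo_u-boot.itb > demo_combined.img
echo "idbloader: $(wc -c < demo_idbloader.img) bytes"   # 393216 = 0x60000
echo "combined:  $(wc -c < demo_combined.img) bytes"
```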

This script will only work with the newest version, though. There were some changes
(see: https://github.com/u-boot/u-boot/commit/...8747a9d217)
which affect the following lines:

Code:
#cp env/built-in.o built_in_env.o
#aarch64-linux-gnu-objcopy -O binary -j ".rodata.default_environment" built_in_env.o  
cp env/common.o common.o
aarch64-linux-gnu-objcopy --dump-section .rodata.default_environment=common.o env/common.o

# Replace null terminator in built-in env with newlines

#tr '\0' '\n' < built_in_env.o | sed '/^$/d' > built_in_env.txt
tr '\0' '\n' < common.o | sed '/^$/d' > built_in_env.txt

If you want to use an older version, simply comment out my lines and uncomment the ones above them.
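For context, the objcopy/tr/sed steps just turn U-Boot's built-in environment (a blob of NUL-terminated "key=value" strings) into the newline-separated text file that mkenvimage expects. A toy example of the conversion, with made-up variable values and file names:

Code:
```shell
#!/bin/sh
# U-Boot stores its default environment as consecutive NUL-terminated
# "key=value" strings; mkenvimage wants one entry per line. Simulate the
# dumped blob and convert it the same way the script does.
set -e
printf 'bootdelay=2\0baudrate=1500000\0\0' > demo_env.bin
tr '\0' '\n' < demo_env.bin | sed '/^$/d' > demo_env.txt   # drop empty lines
cat demo_env.txt
```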

I don't have time to test it at the moment, so I can't guarantee that the merge is compatible with the newest version of U-Boot.
Nevertheless, I didn't get any errors during compilation, and the images are built successfully.
You can find the images under "/home/user/build" inside your Docker container (for example, copy them to the host with "docker cp cross-compile:/home/user/build .").
As I haven't tested them with the newest version, use them at your own risk.