[SOLVED] Decryption bootloader

Hi everyone,

I'd like to develop a bootloader that can decrypt Arduino code, load it into memory and launch it.

So far I have followed Atmel's "Safe and Secure Bootloader" application note. I managed to compile the bootloader as well as the associated tests.

I managed to install the bootloader and run Atmel's tests, but when I try to load an Arduino binary, it doesn't work. Maybe I'm not approaching the problem from the right angle, or there's something I've done wrong.

Thank you in advance for your answers.

Please explain exactly what you mean by "it doesn't work".

Please post a link to the app note. Use the chain links icon on the toolbar to make the link clickable.

Thanks for your answer,

Here are links to the Atmel application note and the associated source code:
Application Note
Source code

I followed this note and loaded two examples provided by Atmel, and they worked (LEDs connected to pins 17, 20 and 21 blinked).

But when I try to load a compiled Arduino binary and execute it with the custom bootloader, it doesn't work.
The binary is the simple Blink sketch from the Arduino examples, so once it is uploaded and launched, the LED connected to pin 13 should blink, but it doesn't.

I have never tried to make a decryption bootloader, and I guess it is something very complex. There might be a simpler method to ensure a safe transfer of new software.

Since each SAM3X8E has its own 128-bit Unique Identifier, before delivering a new software version for a device you sold, you can use different parts of its Unique Identifier at several checkpoints in the software, then release the binary. Of course, the software will need a Unique Identifier reader in its setup().

The customer will be able to run the binary successfully only on the device you previously sold them, and a decompiler could only give glimpses of the actual code.

Thanks for your answer ard_newbie,

The problem is that I don't want to send a plain, unencrypted binary to customers, and I'm really interested in this topic.

I have some good notions of assembly, and for now I'm just trying to launch an application based at address 0x88000.

void _binExec (void * l_code_addr)
{
	__asm (	"mov	r1, r0			\n"	/* r1 = vector table base (l_code_addr) */
			"ldr	r0, [r1, #4]	\n"	/* r0 = reset handler address (word 1)  */
			"ldr	sp, [r1]		\n"	/* sp = initial stack pointer (word 0)  */
			/*"msr	msp, [r1]		\n"*/
			"blx	r0"					/* jump to the application              */
	);
}

unsigned int main (void)
{
	void * vAppStartAddress = (void *)((unsigned int) 0x88000);
	_binExec(vAppStartAddress);
	return (unsigned int) vAppStartAddress;
}

So I load this code at address 0x80000 (the start of flash), set GPNVM1 (boot from flash), and load my application at address 0x88000.
It's very simple, but I think I'm not launching my app correctly.

I don't understand why you are not using l_code_addr in _binExec().

I would try this:

void _binExec(void *code_addr)
{
    void (*pFct)(void) = NULL;

    /* Point to the reset handler address located in the second word of the vector table */
    pFct = (void (*)(void))(*(uint32_t *)((uint32_t)code_addr + 4));
    pFct();
}

Edit: I wonder whether the vector table should be relocated?

I do use l_code_addr. With inline asm, the first parameter is passed in r0, the second in r1, etc. (per the ARM calling convention).

My asm code does exactly the same thing as your C code.

This is pseudo code for it:

r0 = 0x88000        ; l_code_addr
r1 = r0
r0 = *(r1 + 4)      ; reset handler, word at byte offset 4
sp = *r1            ; initial stack pointer, word at offset 0
jump(r0)

sp is the stack pointer (r13).

If the vector table needs to be relocated, see this thread, reply #14:

http://www.avrfreaks.net/forum/change-linker-address-selfmade-bootloader

I don't really know how the vector table works, but my code looks like this now:

void _binExec (void * l_code_addr)
{
	__asm (	"mov	r1, r0			\n"	/* r1 = vector table base (l_code_addr) */
			"ldr	r0, [r1, #4]	\n"	/* r0 = reset handler address (word 1)  */
			"ldr	sp, [r1]		\n"	/* sp = initial stack pointer (word 0)  */
			/*"msr	msp, [r1]		\n"*/
			"blx	r0"					/* jump to the application              */
	);
}

unsigned int main (void)
{
	void	*vAppStartAddress = (void *)((unsigned int) 0x88000);
	int		i;

	// -- Disable interrupts
	// Globally mask interrupts
	__disable_irq();
	// Disable all peripheral IRQs in the NVIC
	for (i = 0; i < 8; i ++)
	{
		NVIC->ICER[i] = 0xFFFFFFFF;
	}
	// Clear pending IRQs
	for (i = 0; i < 8; i ++)
	{
		NVIC->ICPR[i] = 0xFFFFFFFF;
	}

	// -- Modify vector table location
	// Barriers
	__DSB();
	__ISB();
	// Point the vector table at the application
	SCB->VTOR = ((uint32_t)vAppStartAddress & SCB_VTOR_TBLOFF_Msk); // = 0x88000
	// Barriers
	__DSB();
	__ISB();

	// -- Enable interrupts
	__enable_irq();

	_binExec(vAppStartAddress);
	return (unsigned int) vAppStartAddress;
}

The test programs provided by Atmel still blink, but the Arduino binary doesn't.

See this thread too, reply #2:

http://forum.arduino.cc/index.php?topic=311803.0

My LED doesn't blink anymore.

I don't understand why he modifies the LENGTH that way. If the flash.ld file is written correctly, we only have to change ORIGIN to 0x88000 and LENGTH to 0x80000 - 0x8000 (= 0x78000).

Edit: Changing ORIGIN works, my LED blinks. Thanks for your help ard_newbie :wink:
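For anyone else hitting this, the relevant MEMORY change in the application's flash.ld would look roughly like this. This is a sketch assuming a SAM3X8E (512 KB of flash at 0x00080000, 96 KB of contiguous SRAM mirrored at 0x20070000) with a 32 KB bootloader region; check your part's memory map before copying the values:

```
MEMORY
{
    /* Application region: flash after the 32 KB bootloader */
    rom (rx)  : ORIGIN = 0x00088000, LENGTH = 0x00078000 /* 0x80000 - 0x8000 */
    ram (rwx) : ORIGIN = 0x20070000, LENGTH = 0x00018000
}
```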

Congrats ConstantBrunet !

Edit your post and prefix its title with [SOLVED]