
arch: arm: protect r0 in z_arch_switch_to_main_thread() inline ASM #20094


Merged

Conversation

ioannisg
Member

Adding r0 to the clobber list in the inline ASM block of
z_arch_switch_to_main_thread(). This instructs the compiler
not to use r0 to store ASM expression operands, e.g. in
the subsequent instruction, msr PSP, %1.

Signed-off-by: Ioannis Glaropoulos [email protected]

Fixes #16900

@ioannisg
Member Author

This has not been shown to be an actual bug with GCC, but IMHO it is worth fixing: it appears to have no compile- or run-time cost. GCC will simply pick another low register to hold the stack pointer value before writing it to PSP.

@ioannisg ioannisg added this to the v2.1.0 milestone Oct 24, 2019
@zephyrbot zephyrbot added the area: ARM ARM (32-bit) Architecture label Oct 24, 2019
@ioannisg ioannisg requested a review from pabigot October 26, 2019 13:16
Collaborator

@pabigot pabigot left a comment


Yeah, r0 does need to be in clobber. Technically r1, r2, and r3 should be listed as clobbers too. (It's not necessary only because there's nothing reachable after the asm statement.)
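To illustrate the point being made, here is a simplified sketch of the pattern under discussion (hypothetical code, not the actual Zephyr source; function and operand names are illustrative). The ASM block hand-writes r0 while also using compiler-allocated operand registers, so r0 must appear in the clobber list:

```c
/* Hypothetical sketch (not the actual Zephyr implementation):
 * an inline ASM block that both uses compiler-chosen operand
 * registers (%0, %1) and hand-writes r0.
 */
static inline void switch_to_main_sketch(char *main_stack, void *main_entry)
{
	__asm__ volatile (
		"msr PSP, %0\n\t"	/* %0 lives in a compiler-chosen register */
		"movs r0, #0\n\t"	/* hand-written use of r0 ... */
		"bx %1\n\t"		/* ... so r0 must be declared clobbered */
		:
		: "r" (main_stack), "r" (main_entry)
		: "r0", "memory"	/* without "r0" here, GCC is free to
					 * place %0 or %1 in r0, which the
					 * movs above would then corrupt */
	);
}
```

The review comment's point about r1-r3 follows the same logic: any register the ASM body writes without telling the compiler should be listed, unless (as here) control never returns past the statement.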

I believe that if you move the unconditional `movs r1, #0` on line 446 up to line 434 you can eliminate the instruction from line 440, which saves 4 bytes.
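The reordering being proposed could be sketched as follows (a hypothetical GNU C inline ASM fragment; the surrounding instructions and the line numbers 434/440/446 refer to the reviewed file, not to anything shown here):

```c
/* Hypothetical sketch of the suggestion: hoist the unconditional
 * clear of r1 to the top of the sequence, so that a later,
 * redundant "movs r1, #0" can be dropped entirely.
 */
__asm__ volatile (
	"movs r1, #0\n\t"	/* hoisted: r1 is now zero on every path */
	/* ... conditional code that previously needed its own clear ... */
	/* (the second, unconditional "movs r1, #0" is removed,
	 * saving one instruction) */
	:
	:
	: "r1", "memory"
);
```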

Adding r0 to the clobber list in the inline ASM block of
z_arch_switch_to_main_thread(). This instructs the compiler
not to use r0 to store ASM expression operands, e.g. in
the subsequent instruction, msr PSP, %1.

We also do a minor optimization with the clearing of R1
before jumping to main.

Signed-off-by: Ioannis Glaropoulos <[email protected]>
@ioannisg ioannisg force-pushed the arch_arm_switch_to_main_protect_r0 branch from 9a7c6fb to 0b6a48b Compare October 28, 2019 06:11
@ioannisg
Member Author

> Yeah, r0 does need to be in clobber. Technically r1, r2, and r3 should be listed as clobbers too. (It's not necessary only because there's nothing reachable after the asm statement.)
>
> I believe that if you move the unconditional `movs r1, #0` on line 446 up to line 434 you can eliminate the instruction from line 440, which saves 4 bytes.

Done, thanks for reviewing this, @pabigot.

@galak galak merged commit 03734ee into zephyrproject-rtos:master Oct 29, 2019
Labels
area: ARM ARM (32-bit) Architecture
Development

Successfully merging this pull request may close these issues.

Inline assembly in Arm z_arch_switch_to_main_thread missing clobber list
6 participants