Ensuring 16-bit THUMB Instruction Encoding for Code Execution from SRAM on STM32F103

As an exercise, I want to execute code from internal SRAM on an STM32F103. My plan is to write some THUMB assembly by hand, assemble it with arm-none-eabi-as, write the resulting machine code into SRAM with OpenOCD's mwh command, point the PC at the start of SRAM with reg pc 0x20000000, and single-step through the instructions.
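
For reference, the OpenOCD session I have in mind looks roughly like this (a sketch, not a verified transcript; the mwh values here are placeholder halfwords, not my actual program). Since mwh writes one 16-bit halfword per call, the width of each instruction determines how many writes I need:

$ telnet localhost 4444
> halt
> mwh 0x20000000 0x2028
> mwh 0x20000002 0x2102
> reg pc 0x20000000
> step
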

The assembly code I wrote is a simple loop:

.thumb
.syntax unified

mov r0, #40         @ r0 = 40
mov r1, #2          @ r1 = 2
add r2, r0, r1      @ r2 = r0 + r1
mvn r0, #0x20000000 @ r0 = bitwise NOT of 0x20000000
bx r0               @ branch to the address in r0


Here are the commands I used to assemble and disassemble the code:

$ arm-none-eabi-as -mthumb -mcpu=cortex-m3 -o main.o main.S
$ arm-none-eabi-objdump -d -m armv7 main.o


The disassembler output is as follows:

main.o:     file format elf32-littlearm

Disassembly of section .text:

00000000 <.text>:
   0:   f04f 0028   mov.w   r0, #40 ; 0x28
   4:   f04f 0102   mov.w   r1, #2
   8:   eb00 0201   add.w   r2, r0, r1
   c:   f06f 5000   mvn.w   r0, #536870912  ; 0x20000000
  10:   4700        bx  r0
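

To get the raw halfwords for mwh, I extract the code into a flat binary and dump it (a sketch; this assumes everything lands in .text and that your xxd supports the -e little-endian option):

$ arm-none-eabi-objcopy -O binary main.o main.bin
$ xxd -e -g2 main.bin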


My understanding was that THUMB instructions are 16 bits wide, but the disassembly shows that everything except the final bx r0 was encoded as a 32-bit instruction (for example, mov.w r0, #40 became f04f 0028). I was expecting 16-bit encodings throughout, since I am assembling for THUMB.
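
For comparison, this is the kind of encoding I was expecting. If I read the ARMv7-M manual correctly, the flag-setting form of a small move-immediate has a traditional 16-bit encoding (treat the exact halfword here as my own assumption):

movs r0, #40    @ encoding T1; should assemble to the single halfword 0x2028
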

Why are some of my instructions encoded as 32-bit THUMB-2 instructions, and how can I make the assembler emit only 16-bit THUMB encodings for the machine code I load into SRAM?