C# • 3y ago • 30 replies
Tacti Tacoz

✅ What is the deal with default JIT optimizations?

According to my memory debugger, in this code:
interface ITest1 { }
interface ITest2 { }

struct Test1 : ITest1 { }

[System.Runtime.CompilerServices.MethodImplAttribute(System.Runtime.CompilerServices.MethodImplOptions.NoInlining)]
static bool test1<T>(ref T value)
{
    return value is ITest1;
}

[System.Runtime.CompilerServices.MethodImplAttribute(System.Runtime.CompilerServices.MethodImplOptions.NoInlining)]
static bool test2<T>(ref T value)
{
    return value is ITest2;
}

[System.Runtime.CompilerServices.MethodImplAttribute(System.Runtime.CompilerServices.MethodImplOptions.NoInlining)]
static object Test<T>(ref T value)
{
    bool flag = true;
    int x = 0;
    int y = 0;
    while (flag) // 'flag' is never cleared, so this spins until the process is stopped
    {
        if (test1(ref value))
            x++;
        if (test2(ref value))
            y++;
    }
    return null;
}
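
To reproduce, I drive it roughly like this (hypothetical call site, not part of the snippet above; the endless loop is intentional so the profiler can watch it spin):

var value = new Test1();
Test(ref value); // runs forever; allocations show up live in the memory debugger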

The generic value passed to test1 and test2 gets boxed at every iteration.
In a debug scenario this isn't surprising, given that the emitted IL for this does indeed box.
But I thought boxing like this is supposed to be optimized out by default in a release build.

To make this code get optimized like it should be, the "AggressiveOptimization" flag has to be applied manually.
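
For reference, here's what I mean (a minimal sketch; MethodImplOptions values are flags, so AggressiveOptimization can be combined with NoInlining):

using System.Runtime.CompilerServices;

// Sketch: AggressiveOptimization makes the JIT compile this method fully
// optimized right away instead of starting it at the unoptimized tier.
[MethodImpl(MethodImplOptions.NoInlining | MethodImplOptions.AggressiveOptimization)]
static bool test1<T>(ref T value)
{
    return value is ITest1; // with optimization, no box for struct T
}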

My project settings are standard .NET 7 settings.

Unless I'm missing something, this seems rather odd.
In fact, the implementations of hot and common code paths like GenericEqualityComparer depend on a similar optimization happening. Obviously that is getting optimized, otherwise simple collection lookups would blow the GC up on a regular basis.
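
For example (a hedged sketch of the pattern, not the actual corelib source), a lookup like this is expected to stay allocation-free for struct keys precisely because the JIT specializes the comparer call per instantiation:

// EqualityComparer<T>.Default.Equals on a struct T should not box once
// the method is compiled with optimizations enabled.
static bool Contains<T>(T[] items, T needle)
{
    var cmp = System.Collections.Generic.EqualityComparer<T>.Default;
    foreach (var item in items)
        if (cmp.Equals(item, needle))
            return true;
    return false;
}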

Having optimizations disabled by default is surely not the intended behavior here?
Or is my memory debugger making some pretty heavy assumptions without actually analyzing memory allocations correctly?
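
One way to cross-check it (a minimal sketch using GC.GetAllocatedBytesForCurrentThread, with a bounded loop in place of the endless one above):

// Hypothetical measurement harness, not part of the original repro:
// if boxing happens per call, the delta grows with the iteration count;
// if the JIT elides the boxing, the delta stays near zero.
var value = new Test1();
long before = GC.GetAllocatedBytesForCurrentThread();
for (int i = 0; i < 1_000_000; i++)
{
    test1(ref value);
    test2(ref value);
}
long after = GC.GetAllocatedBytesForCurrentThread();
Console.WriteLine($"Allocated: {after - before} bytes");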

Thanks in advance for any insight you guys can give.