> I've also used LoadImpact
https://loadimpact.com/
K6 still has all those features (TBF, it was much more than just a rebrand; it's technically a very different and much more feature-complete product now).
> and thus be able to catch poorly performing sections as they are made.
Load testing itself won't really help you with this. Load testing tells you what load your site, on given infrastructure, can handle. That's great if you want to work out what infra you need to scale before a client's massive email send, or to understand more general performance characteristics under load, but not much else.
Sure you can tell from a load test if a site is not performant, and you may be able to narrow down particular problem URLs, but to check for poorly performing code it's better to profile your site and literally see where time is spent.
By all means use something like K6 to generate load, but the key is in profiling using something like dotTrace.
If I'm working on a performance-critical project, I'll be asking at code review, "I see you've introduced this new method - how many microseconds does that take to run on your machine?" That's right: I'd expect every dev to micro-profile every new view, method, etc. so that they know how many milli/micro/nanoseconds of CPU time they're adding to the project. It's so quick and easy to do with just `System.Diagnostics.Stopwatch` and `Console.WriteLine`, or MiniProfiler, while we're testing our code, and it will head off performance problems before they become a problem.
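A minimal sketch of the Stopwatch-and-WriteLine habit described above. `SumOfSquares` is a hypothetical method standing in for whatever you've just written; the warm-up call and the loop of repeated calls are my assumptions about how you'd get a stable per-call number rather than a one-off reading:

```csharp
using System;
using System.Diagnostics;

class MicroProfileDemo
{
    // Hypothetical method under test - stands in for the new code being reviewed.
    public static int SumOfSquares(int n)
    {
        int total = 0;
        for (int i = 0; i < n; i++) total += i * i;
        return total;
    }

    static void Main()
    {
        // Warm-up call so JIT compilation doesn't pollute the measurement.
        SumOfSquares(1000);

        const int iterations = 10_000;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            SumOfSquares(1000);
        sw.Stop();

        // Average cost per call, in microseconds.
        double micros = sw.Elapsed.TotalMilliseconds * 1000.0 / iterations;
        Console.WriteLine($"SumOfSquares(1000): ~{micros:F2} us per call");
    }
}
```

For numbers you'd actually quote in a review, something like BenchmarkDotNet handles warm-up, outliers, and statistical noise properly; the snippet above is the quick-and-dirty version that takes thirty seconds to wire in while you're testing.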