Load testing tools/strategies
# help-with-other
j
Looking to introduce load testing more on our sites, but it's something we do very little of atm, so I was hoping anyone here would have some experience to share 🙂 Been testing out k6.io as we'd want something that's quick to set up and easy to maintain. The ideal would be that we could continuously test a project while it's being developed and thus be able to catch poorly performing sections as they are made.
r
I have used k6 in the past to test some issues discovered in live and thought it was great. For the project I had, we made some scripts to target specific actions, and it was the responsibility of the dev to run them when necessary (mainly because we would need to spin up the site to run the tests). They do have a GitHub Action available, but you'd need to have your project running somewhere: https://github.com/grafana/run-k6-action

I've also used LoadImpact in the past, which can run concurrent tests from around the globe, but that can quickly start getting expensive, especially if you want to run tests regularly. In fact, checking myself, I also found that k6 has a desktop app for building and running tests, which looks pretty sweet for GUI guys like me!
j
> I've also used LoadImpact

https://loadimpact.com/ 😅 K6 still has all those features (TBF, it was much more than just a rebrand; it's technically a very different and much more feature-complete product now).

> and thus be able to catch poorly performing sections as they are made.

Load testing itself won't really help you with this. Load testing tells you what load your site, on given infrastructure, can handle. That's great if you want to work out what infra you need to scale before a client's massive email send, or for understanding more general performance characteristics under load, but not much else. Sure, you can tell from a load test that a site is not performant, and you may be able to narrow down particular problem URLs, but to check for poorly performing code it's better to profile your site and literally see where the time is spent.

By all means use something like K6 to generate load, but the key is in profiling, using something like dotTrace. If I'm working on a performance-critical project, I'll be asking at code review: "I see you've introduced this new method. How many microseconds does that take to run on your machine?" That's right: I'd expect every dev to be micro-profiling every new view, method, etc., so that they know how many milli/micro/nanoseconds of CPU time they're adding to the project. It's so quick and easy to do with just `System.Diagnostics.Stopwatch` and `Console.WriteLine` (or MiniProfiler) while we're testing our code, and it will head off performance problems before they become a problem.
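To make that concrete, here's a minimal sketch of that kind of ad-hoc timing; `RenderNavigation` is a hypothetical stand-in for whatever new method is under review:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class QuickTiming
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        // The code under test goes here; RenderNavigation is a
        // hypothetical stand-in for the new method being reviewed.
        RenderNavigation();

        sw.Stop();

        // Convert ticks to microseconds; Stopwatch.Frequency is ticks per second.
        double microseconds = sw.ElapsedTicks * 1_000_000.0 / Stopwatch.Frequency;
        Console.WriteLine($"RenderNavigation took {microseconds:F1} us");
    }

    static void RenderNavigation()
    {
        // Placeholder workload so the sample runs standalone.
        Thread.Sleep(5);
    }
}
```

Worth running it a few times on a Release build, since the first run includes JIT time.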
r
The performance king has spoken! 💖 That's an interesting take on individual PR performance, and it sounds great when you know it's going to be a performance-critical project out of the gate. In my experience it's more often been a project that has gone live and then needs some performance work afterwards.
j
Yes, this is most projects IME too, though I get the impression Jesper's talking about building something new here. For general performance improvements on an existing site, it's all about the telemetry for me, and I'm a huge fan of Application Insights for that. You can then look for requests that are slower than they should be under real-world conditions and drill down from there; there's also a really nice Code Optimizations tab that will flag places where it thinks too much CPU or RAM is being used. Load testing with that in place will then help flag certain kinds of bottlenecks, but there shouldn't be any surprises in terms of general performance that you can't see with regular real-world/production traffic.
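For reference, wiring Application Insights into an ASP.NET Core site is a one-liner via the Microsoft.ApplicationInsights.AspNetCore package. A minimal sketch, assuming the connection string is supplied through configuration:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Collects request, dependency and exception telemetry automatically.
// The connection string comes from ApplicationInsights:ConnectionString
// in configuration, or the APPLICATIONINSIGHTS_CONNECTION_STRING env var.
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();
app.MapGet("/", () => "Hello");
app.Run();
```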
j
I understand my question was a bit all over the place 😄 We generally want to have a better understanding of, and some specific benchmarks for, load times on our sites, so I've been looking at something a bit generic that could be rolled out on many projects to give an idea of performance. I agree that profiling the solution is the best way, but that is also quite time-consuming and will be specific to each site. If we can run something like k6 from the beginning of a project, with specific load time requirements for x amount of users on the same infrastructure, then it's an easy-to-set-up benchmark we could roll out pretty quickly. We're not necessarily looking to hyper-optimize a bunch of sites, but more to catch terrible performance in certain areas 🙂

But it sounds like the consensus is that k6 is the way to go, so I will get that running, and potentially look into something like BenchmarkDotNet for critical functionality.
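On the BenchmarkDotNet route, a minimal benchmark for a critical method might look like the sketch below; `NavigationBenchmarks` and `BuildNavigation` are made-up names for illustration:

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

[MemoryDiagnoser] // also report allocations per operation
public class NavigationBenchmarks
{
    // Hypothetical hot path; swap in the real code you care about.
    [Benchmark]
    public string BuildNavigation() =>
        string.Join("/", "home", "products", "widgets");
}

public static class Program
{
    public static void Main()
    {
        // Runs every [Benchmark] method and prints mean time, error and allocations.
        BenchmarkRunner.Run<NavigationBenchmarks>();
    }
}
```

It needs a Release build (`dotnet run -c Release`) to produce meaningful numbers.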
e
Might be late to the party, but we've used JMeter with Azure Load Testing. It works really well until you start doing API simulations for >100 simultaneous runs. The upside is you get to port the JMeter scripts across environments, and Azure produces a report that makes some sense to non-tech stakeholders. It also costs to run that service, so if budget is capped it's probably not the best solution. Maybe Grafana paired with K6 might be your best bet?