Eager Optimization Is The Enemy

This article takes a stand against eager optimization and explains why it should be left by the wayside. Which side are you on?

By Sam Atkinson · May. 20, 16 · Opinion · 13.5K Views


I am officially declaring war on the eager optimization crowd. You may be part of it and not even know it. Let’s take this comment from a DZone article I posted a while back:

“But if you already know something is slow and how to write it correctly then that is not premature optimisation, that is smart coding.”

See, this guy thinks he isn’t part of the eager optimization crowd. Because he doesn’t do it, unless he knows that a certain method or algorithm is “slow” and needs to be written performantly. Then he’ll write it performantly from the off.

This is eager optimization, dammit.

This is what results in codebases with outrageously complicated code that can be replaced with a for-loop. Because you “know better”. I’m here to tell you, you don’t.

Measure everything

Let’s go back to the start. When you’re writing a piece of code, whatever it is, you will have performance concerns. Perhaps it’s part of a batch job that runs overnight, in which case it can probably go as slowly as it likes. Conversely, it could be a dependency in a super low-latency system. Or maybe it’s returning a response to a user on a webpage.

These are three very different requirements. The important thing is that to each of them you can attach an actual number:

  • The batch needs to complete in under 7 hours

  • The super low-latency system needs a response in under 1ms

  • The webpage needs a response in under 300ms

The beauty of this is that you have an actual number to target. This means you can empirically prove whether or not you’re doing the job. When people start waffling to me about needing a complex algorithm for a piece of work to go fast enough, I can simply say one thing:

Prove it.

There are genuinely times when you need your outrageously complex algorithm from a textbook. But these are rare. Normally a bunch of for-loops and/or hashmaps will do the job perfectly well. The built-in Java algorithms for searching aren’t bad.
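To make that concrete, here’s a minimal sketch (names and data are illustrative, not from the article) of the "boring" approach: a plain linear scan, with the built-in `Collections.binarySearch` available for the sorted case. Either one is a few lines, obviously correct, and fast enough for in-memory lists in the vast majority of cases.

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class SimpleSearch {
    // A plain linear scan: O(n), trivially correct, no preconditions on the data.
    public static int indexOf(List<String> items, String target) {
        for (int i = 0; i < items.size(); i++) {
            if (items.get(i).equals(target)) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        List<String> users = Arrays.asList("alice", "bob", "carol");
        System.out.println(indexOf(users, "bob")); // 1

        // If the list happens to be sorted, the JDK already ships a binary search:
        System.out.println(Collections.binarySearch(users, "carol")); // 2
    }
}
```

Only reach for something cleverer once a measurement against your target proves the scan is the bottleneck.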

I’m a massive proponent of TDD. TDD dictates implementing the simplest, most stupid thing that gets the test to pass, and then going from there. This is absolutely how you should approach any potential complexity in your code.

First, write the simplest, cleanest thing you can to make it functional. Cleanliness is the most important goal in code (along with ensuring it works). Clean, well-tested code results in long-term maintainable systems. Use the built-in Java libraries where possible, as the consumer of your code will understand these at a minimum, and as I mentioned before, they’re usually pretty good and certainly well tested.

Now, test the performance. Actually measure it against your target. In my experience, the majority of the time you’ll hit your target and can move on with your life.
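A measurement like this doesn’t need a benchmarking framework to get started. Here’s a rough sketch (the operation and the 300ms webpage budget from the list above are just placeholders for your own code and your own number) of timing the simple implementation against its target:

```java
import java.util.Arrays;

public class LatencyCheck {
    // Stand-in for the real operation under test.
    public static long sum(int[] data) {
        long total = 0;
        for (int x : data) {
            total += x;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = new int[1_000_000];
        Arrays.fill(data, 1);

        // Warm up so the JIT has compiled the hot path before we time it.
        for (int i = 0; i < 100; i++) {
            sum(data);
        }

        long start = System.nanoTime();
        long result = sum(data);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        System.out.println("result=" + result + ", elapsedMs=" + elapsedMs);
        // 300ms is the assumed webpage budget; fail loudly if we miss it.
        if (elapsedMs > 300) {
            throw new AssertionError("Missed the 300ms target: " + elapsedMs + "ms");
        }
    }
}
```

For anything where nanoseconds matter, a proper harness such as JMH is the better tool, but even a crude timer like this turns “I think it’s too slow” into a number you can argue with.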

Now I can almost guarantee the outrage in the comments. And whilst I look forward to reading them, I encourage the rest of you to give this a go. Don’t optimize anything at the beginning of your coding. Test. Improve.

The other benefit is that your tests will be even nicer. Using simple implementations helps you generate a clean API for your consumers (whether that’s yourself or a third party). If you do need to improve your code later (and that’s totally OK!), you’ve got a full suite of working tests to conform to, guiding you through the process.

The biggest culprit for me is search. I’m not talking about massive datasets (which would probably live in an appropriate database anyway), but in-memory data: things like lists and maps that are stored intra-application. The number of times I’ve seen people implement their own overly complex caching algorithms on top, which are hard to understand and often don’t work, makes me cry. Usually, they can be ripped out and replaced with a for loop with no detriment (and often with a performance gain).

If you really do need a performance boost, never forget there are people out there smarter than you who have implemented this stuff and had it tested by the community. Their code will be easier to understand and a lot more likely to work. Google Collections (now Guava) is your friend. You don’t need to reinvent the wheel in an attempt to be more performant. No one will thank you for it.


Opinions expressed by DZone contributors are their own.
