PHP vs. Java: are there energy consumption differences?

I heard a rumor that Java consumes less energy than PHP and was wondering whether, and how, this could be true. I'm currently working at a company where we base most of our apps on PHP. Power consumption has never been a problem for us, but we are working on bigger projects where it might matter. We love PHP for web development and are wondering how such a rumor can spread and whether it is true at all.

The example I heard was that Facebook is switching to Java for exactly that reason (I can't seem to find anything about this on Google, though).

Since a customer of mine is asking me this question, I would love proof, if it is true.


Solution 1:

Computers don't particularly care whether they're executing Java or PHP; power consumption is pretty much the same. The question then becomes one of performance: if you can serve more requests with one server, you'll need fewer servers and consume less power. Alternatively, if you're not building web-scale applications, you can serve your requests more quickly and spend more time idling, which consumes less power.
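To make the "fewer servers" arithmetic concrete, here is a minimal back-of-the-envelope sketch. All the figures (peak load, per-server capacity, wattage) are invented for illustration, not measurements:

```java
// Back-of-the-envelope estimate: how platform throughput translates
// into server count and total power draw. Numbers are hypothetical.
public class ServerPowerEstimate {
    public static void main(String[] args) {
        double peakLoad = 10_000;       // requests/second the site must handle
        double reqPerServerA = 2_000;   // capacity of one server on platform A
        double reqPerServerB = 1_000;   // capacity of one server on platform B
        double wattsPerServer = 300;    // draw of one server under load

        long serversA = (long) Math.ceil(peakLoad / reqPerServerA);
        long serversB = (long) Math.ceil(peakLoad / reqPerServerB);

        System.out.printf("Platform A: %d servers, %.0f W%n",
                serversA, serversA * wattsPerServer);
        System.out.printf("Platform B: %d servers, %.0f W%n",
                serversB, serversB * wattsPerServer);
        // With these made-up figures, the 2x-faster platform needs
        // half the servers and so draws half the power.
    }
}
```

The point is only that power scales with server count, and server count scales inversely with per-server throughput.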

Given pure Java and pure PHP, Java, as a statically typed, JIT-compiled language, is of course faster. The real question is which one you can make faster given the team members and development effort available to you.

My take is that the best way is to mix languages: use existing Java-based infrastructure tools such as Terracotta to build the performance-critical parts, and something more nimble to build the complex but not especially heavy business and presentation logic.

Solution 2:

I really, really doubt it is a language-only issue.

The platforms in question have so much variability that any generic comparison is moot. To name a few points of variability:

  • Servlet container for Java (Tomcat, Glassfish, Websphere, Jetty, ...)
  • Web server for PHP (Apache, IIS, lighttpd, nginx, ...)
  • Opcode caches for PHP
  • Libraries and frameworks used
  • Operating systems
  • Hard disks involved
  • Cooling
  • Algorithms in the application itself

I really doubt you can isolate so many variables into a useful metric. At most, you can pick two equivalent applications (noting all the platform choices), run them on the same hardware, and compare them. Then improve the worse one until it tops the better one. The proper measurement, I think, would combine power draw in watts with throughput in requests per second, which together give energy per request in joules.
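Those two numbers fold into a single efficiency figure: since a watt is a joule per second, dividing power draw by requests per second yields joules per request. A small sketch with invented measurements:

```java
// Energy per request: power (watts = joules/second) divided by
// throughput (requests/second) yields joules per request.
// The measurements below are hypothetical placeholders.
public class EnergyPerRequest {
    static double joulesPerRequest(double watts, double requestsPerSecond) {
        return watts / requestsPerSecond;
    }

    public static void main(String[] args) {
        // Two equivalent apps measured on the same hardware:
        System.out.println(joulesPerRequest(250.0, 500.0)); // 0.5 J/request
        System.out.println(joulesPerRequest(250.0, 800.0)); // 0.3125 J/request
        // The second app does the same work on less energy per request,
        // even though both boxes draw the same 250 W under load.
    }
}
```

This is why raw wattage alone is meaningless without the throughput number alongside it.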

What's noted in Ants' answer (upvote him) is the crucial point, though: given enough demand, the better-performing platform will always be more power efficient, because it will be able to serve the same number of requests with less hardware.

But which platform performs better is not merely language dependent; it depends on the things noted above (and more).

Solution 3:

I was surprised by this question until I did a Google search, which turned this up. It's a serious issue, one that I wouldn't have thought of.

Now that I'm thinking about it, it becomes a question of who pays the electric bill. Sun's Java strategy was about selling servers: big iron for the back end, thin clients for the front end.

Perhaps technologies like Flex move more of the work back to the client and leave the client with a greater share of the electric bill.

But I'd be surprised to see a ranking of languages by energy use.

A very interesting question. I'm voting it up.

What a fascinating problem. Wouldn't it be interesting to write an application in a number of languages, deploy them on identical hardware, and measure the power consumption? I'd love to see it.

If you really want to get crazy, what about a performance monitoring tool that, in addition to showing you where memory and CPU were consumed in each part of your app, would also show you where the most energy was being used?

Now I wish I could vote this question up again.

Solution 4:

Like many comparative questions here, you'll probably need to come up with a benchmark to really determine whether that's true.

lesswatts.org has a bit of information on application power management, as well as several other aspects of power consumption on Linux systems. As a side note, they seem to be using PHP, so that might be worth something in itself :)

They keep repeating that you should use PowerTOP to determine which applications cause the most power consumption, and you can see from the screenshot that they check wakeups from idle, at least.


Most of the time, a web server is sitting idle, then it "serves" for a very brief moment, then it goes back to waiting for the next connection to serve. In that regard, PHP would contribute very little to the entire process: only the serving portion. So I wonder if the actual benchmark was a comparison of a particular Java-based web server vs. Apache/PHP serving similar pages. If that was the case, it's not really a fair comparison of PHP and Java -- either of the two considered as serving an actual page is only active for milliseconds at a time, typically. I would think the actual web server, the one who's selecting or polling connections, is the one that would be hogging power.