As SOA applications grow in scale, some enterprises find that having Web services pull data from disk-based databases introduces too much latency. Some enterprises address this issue by giving their services a dedicated caching layer, said Amit Pandey, CEO of application and data scalability provider Terracotta.
On Tuesday, the company released version 2.1 of its open source Ehcache distributed caching software for Java, which it acquired in August 2009. Pandey said the focus of this release has been adding enterprise features such as performance monitoring, configurable SLA parameters and improved WebSphere support.
Developers can use Ehcache for caching in either a single-node or in a distributed mode. In version 2.1, Ehcache gained the ability to cache and scale IBM WebSphere application server sessions. It also has improved Java Transaction API support and a plug-in that monitors cache performance.
Due to a rise in popularity of multi-core servers and virtualization, many enterprises are taking a closer look at caching, said Pandey. If you’re running 64 application instances from a single box and they are all accessing the same database, he said, response time can suffer.
“At that point, you do need something in-memory very close to the app to prevent the latency and traffic issues that you’re going to get on the database,” said Pandey. “The last thing developers want to have to do is go in and start hacking at their code and making changes to the business logic.”
In-memory caching and distributed caching let the data live closer to services, so no time is wasted reading from and writing to disk. This way, instead of tweaking the way applications handle data, you can simply make the data more available.
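Ehcache's own API is richer than this, but the core idea Pandey describes, keeping data in memory next to the application and touching the database only on a miss, can be sketched in plain Java. The class and method names below are illustrative, not Ehcache's actual API, and the loader function stands in for a real database call:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal read-through cache sketch: values are served from memory,
// and the backing store (e.g., a database) is consulted only on a miss.
class ReadThroughCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    private final Function<K, V> loader; // stands in for the database read

    ReadThroughCache(Function<K, V> loader) {
        this.loader = loader;
    }

    V get(K key) {
        // computeIfAbsent invokes the loader only when the key is not cached
        return store.computeIfAbsent(key, loader);
    }
}

public class CacheDemo {
    public static void main(String[] args) {
        int[] dbReads = {0};
        ReadThroughCache<String, String> cache =
            new ReadThroughCache<>(key -> {
                dbReads[0]++;          // simulate a slow disk/database read
                return "value-for-" + key;
            });

        cache.get("user:42"); // miss: loads from the "database"
        cache.get("user:42"); // hit: served straight from memory
        System.out.println("loads from backing store: " + dbReads[0]);
    }
}
```

With many application instances, each would hold (or, in distributed mode, share) such a cache, so repeated reads of hot keys never reach the database at all.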