wolfSSL release 1.5.4 adds support for the Mongoose Web Server and Intel's AES-NI encryption instructions, brings speed improvements to SHA-1 through loop unrolling, and includes new testing certificates and bug fixes.
Support for the Mongoose Web Server was user requested. It was added by test building Mongoose against wolfSSL with our OpenSSL compatibility API enabled. Minor functionality was then added to wolfSSL's OpenSSL compatibility API to make the build seamless. The result is a smaller-footprint version of Mongoose for embedded users. Secure connections to Mongoose were then tested and passed. If you find any issues building Mongoose with wolfSSL, please contact us at email@example.com.
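For readers who want to try the combination themselves, a minimal build sketch follows. The `--enable-opensslextra` configure flag is wolfSSL's switch for the OpenSSL compatibility layer; the Mongoose compile line, file names, and SSL define are illustrative assumptions, not the exact commands from our test build.

```shell
# Sketch only: the Mongoose file names and the MONGOOSE_USE_SSL define
# are assumptions for illustration.

# 1. Build and install wolfSSL with the OpenSSL compatibility API enabled.
./configure --enable-opensslextra
make
sudo make install

# 2. Build Mongoose against wolfSSL instead of OpenSSL.
gcc mongoose.c main.c -o mongoose \
    -DMONGOOSE_USE_SSL \
    -I/usr/local/include -L/usr/local/lib -lwolfssl
```

Because the compatibility layer exposes OpenSSL-style calls, Mongoose's existing SSL code paths can compile against wolfSSL with little or no source change.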
Support for AES-NI, Intel's new instruction set for accelerated AES, was added so that our Intel-based users can access the instructions directly through wolfSSL. This new functionality gives wolfSSL users significant speed improvements when using AES on servers under heavy load. More details on how to make the most of wolfSSL's AES-NI support on Intel servers will be available shortly. If you need help sooner, let us know at firstname.lastname@example.org. Details on AES-NI can be found here: http://www.intel.com/Assets/en_US/PDF/whitepaper/Intel_AES-NI_White_Paper.pdf.
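As a rough sketch of how a user might enable this from the command line: `--enable-aesni` is wolfSSL's autoconf option for the feature, while the benchmark path below is an assumption about the source tree layout.

```shell
# Rebuild wolfSSL with AES-NI support; requires a CPU and a compiler
# that both support the AES-NI instructions.
./configure --enable-aesni
make

# Compare AES throughput against a build configured without
# --enable-aesni (benchmark location is an assumption).
./wolfcrypt/benchmark/benchmark
```

Running the bundled benchmark before and after the rebuild is a quick way to see the AES speedup on your own hardware.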
Happy yaSSLing! We hope you enjoy this new release!
yaSSL, the leading C++-based SSL library, is now available for download. This release of yaSSL contains bug fixes, new testing certificates, and a security patch for a potential heap overflow when processing forged application data. The vulnerability was discovered by Matthieu Bonetti of VUPEN Security (http://www.vupen.com).
yaSSL has now been partnered with ARM for 18 months! Why is yaSSL partnered with ARM? Simply stated, because so many of our users and customers are running on the ARM chipset. Users choose wolfSSL on ARM because it is fast, lightweight, and easily embedded to secure connected application software. To meet the needs of our customers, we have ported the wolfSSL embedded SSL library to several ARM-based environments, including Android, ThreadX, QNX, Ubuntu, IAR, MontaVista and OpenWRT. If you need wolfSSL on another ARM-based operating environment that we don't currently support, contact us at email@example.com.
We recently announced that a beta version of secure memcache is available to the community. Users are excited about having SSL available in memcache, but have expressed concern about the performance cost of enabling security. With this feedback in mind, we benchmarked secure memcache against standard memcache. See the graph below for a comparison of secure memcache to standard memcache measured in TPS. The first bar is regular memcache. The rest of the bars show secure memcache running with different cipher options. We've also included the performance of "direct to database," or running without memcache at all, as a baseline for comparison.
We have several additional optimizations in the works that will bring the performance of secure memcache to within 5-10% of standard memcache for most environments, and we can also provide network-, operating-system-, and hardware-specific optimizations for individual users.
You will see from the graph that running memcache with SSL enabled is still 4x faster than going direct to the database. The cost of running securely varies between about 25% and 40%, depending on the cipher. There is some cost to running secure memcache, but it is not prohibitive. Email us at firstname.lastname@example.org if you want more details on the benchmark.
Who should use secure memcache? Generally speaking, anyone concerned about the security of their memcache data:
1. Users with regulatory compliance requirements.
2. SaaS companies hosting sensitive customer and user data who cannot risk a breach that could damage their reputation.
3. Users running in the cloud.
4. Users concerned with masking memcache data securely within their firewall.
Contact us at email@example.com if you are interested in receiving the beta or for more information.