Four Small Java/Coffee Tips That Result in Better Performance (and Better Taste)

By: William Phelps | Senior Technical Architect


It’s really no secret: it’s often the little things that yield the biggest results.

As you are reading this article, chances are that you are perhaps drinking a cup of coffee. There are quite a few small things about coffee that you may or may not know.

  • Simply adding cream to your coffee will keep it warmer about 20 percent longer. Because a hotter liquid loses heat faster than a cooler one, dropping the temperature slightly with cream slows the rest of the cooling. Try it and see.
  • Adding a small pinch of salt to your coffee will cut down on its acidity and result in a much smoother cup. Add the salt to the coffee pot if you brew by the pot, or to your cup if you prefer a single-serve variety. (I personally use kosher salt for this trick, not regular table salt… you’ll likely use too much with table salt.)
  • Coffee grounds are an excellent fertilizer for house plants. The amount found in a regular single-serving cup or Keurig “K” cup would be about right for a small potted plant.
  • The first webcam was set up at Cambridge University to monitor a coffee pot. The coffee was disappearing very quickly, so a webcam was used to show when a fresh pot had finished brewing so people could get a cup.

Rather than going to the extreme of putting a camera on a pot, some folks opt to go out for their coffee fix. Have you ever noticed that the cup of coffee you buy in-store at Starbucks tastes much better brewed in the shop than when you buy the same bag of grounds, take it home, and brew it yourself? There are a handful of reasons for this, but the biggest is probably the finely tuned process the average barista follows.

The very same tuning principle applies to your Java program. “Write once, run anywhere” code is still very dependent on the environment in which it’s deployed. Think of Java as the coffee, whereas the Java Virtual Machine (“JVM”) is the coffee pot that comes with the Java Development Kit (“JDK”). A better “pot,” and better handling of said pot, will produce more consistent results. There are a lot of “coffee pot” manufacturers, but some basic setup steps are the same for all of them.

  • “Avoid installing Java into a file system location with spaces in the path.” Primarily a Windows issue: the Windows Java installer suggests a default installation path under the “Program Files” directory. This becomes a problem when programs start looking for jars to add to the class loader. Unless the program was coded to wrap the classpath in quotes, it will fail in very odd ways that are hard to debug. While this is predominantly a Windows problem, the same thing can happen on other operating systems. A coffee pot will have a proper storage “space” in your home or office; ditch the space, however, in your install paths.
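As a quick illustration (the paths here are made up), an unquoted classpath containing a space gets split into separate arguments:

```shell
# A space in the install path leaks into every derived path:
java -cp /opt/my app/lib/app.jar com.example.Main    # shell passes "/opt/my" and "app/lib/app.jar" separately

# Quoting works around it, but a space-free path avoids this class of bug entirely:
java -cp "/opt/my app/lib/app.jar" com.example.Main
java -cp /opt/myapp/lib/app.jar com.example.Main
```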
  • “Install the JVM/JDK into a generic path location.” While this may seem counterintuitive, once the basic installation is done, updating the JVM/JDK in the future becomes very simple. Seeing the JDK version in a file path is a weird comfort for some people, but having to edit numerous files to update the location reference is fraught with issues, and in some cases may be impossible. It’s simply easier for other folks to find the “working” coffee pot if it’s always stored in the same place. Multiple pots can exist on a server, but this approach makes it clearer which pot is being used.
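One common way to get a generic path (this is a convention, not a JDK requirement; the directory names are illustrative) is to install each versioned JDK side by side and point a stable symlink at the active one:

```shell
# Versioned installs coexist; scripts and JAVA_HOME only ever reference the generic path.
ln -sfn /opt/java/jdk-17.0.9 /opt/java/current
export JAVA_HOME=/opt/java/current

# Upgrading later is a one-line switch, with no config files to edit:
ln -sfn /opt/java/jdk-21.0.2 /opt/java/current
```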
  • “Change/update the random number generator source.” Have you ever gotten tired of waiting for the coffee pot to heat up? Sometimes a JVM is sluggish because of insufficient entropy on the server. This is a somewhat complex topic, but in essence, some operating systems rely on basic I/O activity to gather input for random number generation. If that gathering is slow and multiple processes are waiting for a number, your program can seemingly hang, when in reality it’s just waiting in the queue.

There is a small change that can be made in the JVM to point the random number generation process at another source. In the JDK’s jre/lib/security folder, find the java.security file and search for the line that sets the securerandom.source property.

Change the value from file:/dev/random to file:/dev/./random (the extra “.” path element steers the JVM toward a non-blocking entropy read), and restart your processes that use the JVM.
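A sketch of the edit (Unix-style paths; the exact default value may differ by JDK build):

```
# jre/lib/security/java.security, before:
securerandom.source=file:/dev/random

# after (note the extra "." path element):
securerandom.source=file:/dev/./random
```

The same override can also be applied per process without editing the file, e.g. `java -Djava.security.egd=file:/dev/./random -jar app.jar`.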

This trick has been shown to significantly improve startup times on WebLogic Server.

  • “Reinstall any certificates from the old JVM to the new JVM.” Finally, if the coffee pot is getting an upgrade, some of the “attachments” may still be needed. This is true of certificates that may have been installed in the cacerts file of the old JVM. Before upgrading the JDK, make a copy of the existing cacerts file. Then you can reimport the certificates by merging the deltas from the old cacerts file into the new version.
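A sketch of that merge using keytool’s importkeystore option (file locations are placeholders, and the passwords assume the JDK default of changeit; keytool will prompt before overwriting existing aliases):

```shell
# 1. Before upgrading, keep a copy of the old trust store.
cp "$OLD_JAVA_HOME/jre/lib/security/cacerts" /tmp/cacerts.old

# 2. After upgrading, merge the old entries into the new cacerts.
keytool -importkeystore \
  -srckeystore /tmp/cacerts.old -srcstorepass changeit \
  -destkeystore "$JAVA_HOME/jre/lib/security/cacerts" -deststorepass changeit
```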

Merging this way only imports certificates that exist in the old cacerts file but not in the new one, which is really handy when it’s not known exactly which certificates have changed over time.

It’s the little things that make both a smooth cup of coffee and a smooth-running JVM.


Want more Java tips? Contact us today!

TekStream Recognized in 2021 Splunk Global and Regional Partner Awards

TekStream Named 2021 Global Services Partner of the Year and AMER Professional Services Partner of the Year for Outstanding Performance


TekStream today announced it has received the 2021 Global Services Partner of the Year and 2021 AMER Professional Services Partner of the Year awards for exceptional performance and commitment to Splunk’s Partner+ Program. The 2021 Global Services Partner of the Year Award recognizes a partner with excellence in post-sale and professional services implementations. This partner demonstrates a strong commitment to technical excellence, certifications, and customer satisfaction. The 2021 AMER Professional Services Partner of the Year Award recognizes an AMER Splunk partner that is actively engaged in services implementations, in addition to having a strong commitment to training and certification of their organization. For more information on Splunk’s Partner+ Program, visit the Splunk website.

“We are delighted to have won the 2021 Global Services Partner of the Year and 2021 AMER Professional Services Partner of the Year awards. It is a fantastic achievement to be awarded and even more satisfying to contribute to the success of Splunk and our customers. Our team is very excited to be recognized for its efforts and expertise and will wear this prized recognition proudly,” said Matthew Clemmons, Managing Director at TekStream.

“Congratulations to TekStream for being named the 2021 Splunk Global Services Partner of the Year and 2021 AMER Professional Services Partner of the Year,” said Bill Hustad, VP, Global GTM Partners, Splunk. “The 2021 Splunk Global Partner Awards highlight partners like TekStream that deliver successful business outcomes, as well as help our joint customers leverage Splunk’s Data-to-Everything Platform to drive value and unlock insights. Additionally, TekStream shares our commitment of prioritizing customer success.”

The Splunk Partner Awards recognize partners of the Splunk ecosystem for industry-leading business practices and dedication to constant collaboration. All award recipients were selected by a group of Splunk executives, thought leaders, and the global partner organization.

“We are very honored to have been selected by Splunk for not just one, but two Partner of the Year awards. TekStream prides itself on doing what is right for the customer above all else, and our commitment to that mantra drives everything that we do. We value our partnership and look forward to helping Splunk grow the ecosystem on its way to $5B,” said Karl Cepull, Senior Director, Operational Intelligence at TekStream.

About TekStream

TekStream accelerates clients’ digital transformation by navigating complex technology environments with a combination of technical expertise and staffing solutions. We guide clients’ decisions, quickly implement the right technologies with the right people, and keep them running for sustainable growth. Our battle-tested processes and methodology help companies with legacy systems get to the cloud faster, so they can be agile, reduce costs, and improve operational efficiencies. And with hundreds of deployments under our belt, we can guarantee on-time and on-budget project delivery. That’s why 97% of clients are repeat customers. For more information visit

JSON Structured Data & the SEDCMD in Splunk

By: Khristian Pena | Splunk Consultant



Have you ever worked with structured data that doesn’t follow its own structure? Maybe your JSON data has a syslog header. Maybe your field values have an extra quote, colon, or semicolon, and your application team cannot remediate the issue. Today, we’re going to discuss a powerful tool for reformatting your data so that automatic key/value fields are extracted at search time. These field extractions use the KV_MODE setting in props.conf to automatically extract fields from structured data formats like JSON, CSV, and table-formatted events.



KV_MODE = [none|auto|auto_escaped|multi|json|xml]

This article will focus on the JSON structure and walk through some ways to validate, remediate, and ingest this data using the SEDCMD setting. You may have used SEDCMD to anonymize or mask sensitive data (PHI, PCI, etc.), but today we will use it to replace and append to existing strings.


JSON Structure

JSON supports two data structures that are widely used across programming languages:

  • A collection of name/value pairs. Different languages support this structure under different names: object, record, struct, dictionary, hash table, keyed list, or associative array.
  • An ordered list of values. In various languages, it is called an array, vector, list, or sequence.


An object starts with an open curly bracket { and ends with a closed curly bracket }. Between them, any number of key/value pairs can reside. Each key and value are separated by a colon :, and multiple pairs are separated by a comma ,.


  "Students": [
      { "Name": "Amit Goenka", "Major": "Physics" },
      { "Name": "Smita Pallod", "Major": "Chemistry" },
      { "Name": "Rajeev Sen", "Major": "Mathematics" }
  ]




An array starts with an open bracket [ and ends with a closed bracket ]. Between them, any number of values can reside. If more than one value resides, they are separated by a comma ,.



  [
      { "name": "Bidhan Chatterjee", "email": "" },
      { "name": "Rameshwar Ghosh", "email": "" }
  ]




JSON Format Validation:
Now that we’re a bit more familiar with the structure Splunk expects to extract from, let’s work with a sample. The sample data is JSON wrapped in a syslog header. While this data can be ingested as is, you would have to manually extract each field if you chose not to reformat it. You can validate the structure by copying the event into a JSON format validator.

Sample Data:

May 14 13:28:51 <redacted_hostname> github_audit[22200]: { "above_lock_quota":false, "above_warn_quota":false, "babeld":"eebf1bc7", "babeld_proto":"http", "cloning":false, "cmdline":"/usr/bin/git upload-pack --strict --timeout=0 --stateless-rpc .", "committer_date":"1589477330 -0400", "features":" multi_ack_detailed no-done side-band-64k thin-pack include-tag ofs-delta agent=git/", "frontend":"<redacted>", "frontend_pid":17688, "frontend_ppid":6744, "git_dir":"/data/user/repositories/7/nw/75/42/9d/4564/6435.git", "gitauth_version":"dcddc67b", "hostname":"<redacted>", "pgroup":"22182", "pid":22182, "ppid":22181, "program":"upload-pack", "quotas_enabled":false, "real_ip":"", "remote_addr":"", "remote_port":"15820", "repo_config":"{\"ssh_enabled\":\"true\",\"ldap.debug_logging_enabled\":\"true\",\"auth.reactivate-suspended\":\"true\",\"default_repository_permission\":\"write\",\"allow_private_repository_forking\":\"true\"}", "repo_id":6435, "repo_name":"<redacted>", "repo_public":true, "request_id":"43358116096ea9d54f31596345a0fc38", "shallow":false, "status":"create_pack_file", "uploaded_bytes":968 }


As we can see, the timestamp, hostname, and thread field sit outside of the JSON object, so the event is not valid JSON as ingested.


Replace strings in events with SEDCMD

You can use the SEDCMD method to replace strings or substitute characters. Because it runs in the parsing queue, it must be applied at index time, on the first Splunk instance that parses the data (an indexer or heavy forwarder). The syntax for a sed replace is:

SEDCMD-<class> = s/<regex>/<replacement>/<flags>

  • <class> is a unique name for the stanza. This is important because classes are applied in alphabetical order.
  • <regex> is a Perl-style regular expression.
  • <replacement> is the string that replaces the regular expression match.
  • <flags> can be either the letter g, to replace all matches, or a number, to replace only that numbered match.
  • \1, \2, … can be used in the replacement to insert capture groups from the regex back into the result.
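Applied to the sample above, a stanza along these lines (the sourcetype name and regex here are illustrative, not from the original article) would strip the syslog header so only the JSON object is indexed:

```
[github:audit]
# Remove "May 14 13:28:51 <host> github_audit[22200]: " from the front of each event.
SEDCMD-strip_syslog_header = s/^\w{3}\s+\d+\s+[\d:]+\s+\S+\s+\S+\[\d+\]:\s+//g
```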


How To Test in Splunk:

Copy the data sample into a text file and upload it using Splunk’s built-in Add Data feature under Settings. Try out each SEDCMD and note the difference in the data structure after each attempt.
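Before committing a SEDCMD to props.conf, it can also help to prototype the regex outside Splunk. This small Python sketch (the regex and the trimmed event are illustrative, not from Splunk’s documentation) applies the same substitution:

```python
import json
import re

# A trimmed version of the article's sample: a JSON payload behind a syslog header
# (hostname and field list shortened for the example).
event = 'May 14 13:28:51 myhost github_audit[22200]: { "status":"create_pack_file", "uploaded_bytes":968 }'

# Python equivalent of the index-time substitution
#   SEDCMD-strip_syslog_header = s/^\w{3}\s+\d+\s+[\d:]+\s+\S+\s+\S+\[\d+\]:\s+//g
# It removes the timestamp, hostname, and thread, leaving only the JSON object.
stripped = re.sub(r'^\w{3}\s+\d+\s+[\d:]+\s+\S+\s+\S+\[\d+\]:\s+', '', event)

print(stripped)                    # the bare JSON object
print(json.loads(stripped)["uploaded_bytes"])   # parses cleanly now
```

If `json.loads` succeeds on the stripped string, the equivalent SEDCMD should leave Splunk with a clean JSON event to extract from.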



Props.conf – Search Time Field Extraction


KV_MODE = json
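Putting the index-time and search-time pieces together, a minimal props.conf for this feed might look like the following (the stanza name and regex are assumptions for illustration; SEDCMD takes effect on the parsing tier, KV_MODE on the search head):

```
[github:audit]
# Index time (indexer / heavy forwarder): strip the syslog header.
SEDCMD-strip_syslog_header = s/^\w{3}\s+\d+\s+[\d:]+\s+\S+\s+\S+\[\d+\]:\s+//g

# Search time (search head): auto-extract fields from the remaining JSON.
KV_MODE = json
```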


Want to learn more about JSON structured data & the SEDCMD in Splunk? Contact us today!