Welcome to the Java Programming Forums



Results 1 to 2 of 2

Thread: Java Hashtable, HashMap, ConcurrentHashMap – Performance impact

  1. #1
    Member
    Join Date
    Nov 2017
    Location
    USA
    Posts
    93
    Thanks
    6
    Thanked 1 Time in 1 Post

    Default Java Hashtable, HashMap, ConcurrentHashMap – Performance impact


    There are a good number of articles that articulate the functional differences between HashMap, Hashtable and ConcurrentHashMap. This post compares the performance behavior of these data structures through practical examples. If you don’t have the patience to read the entire post, here is the bottom line: when confronted with the decision of whether to use HashMap, Hashtable or ConcurrentHashMap, consider using ConcurrentHashMap, since it is a thread-safe implementation that comes without a significant compromise in performance.

    Performance Study
    To study the performance characteristics, I have put together this sample program:

    1 public class HashMapPerformance {
    2
    3   public static int ITERATION_COUNT = 10000000;
    4   
    5   private static AtomicInteger exitThreadCount = new AtomicInteger(0);   
    6   
    7   public static HashMap<Integer, Integer> myHashMap;
    8   
    9   public static void initData() {
    10      
    11      myHashMap = new HashMap<>(1000);
    12      
    13      for (int counter = 0; counter < 1000; ++counter) {
    14         
    15         myHashMap.put(counter, counter);
    16      }      
    17   }
    18   
    19   private static class Writer extends Thread{
    20
    21      public void run() {
    22         
    23         Random random = new Random();
    24         
    25         for (int iteration = 0; iteration < ITERATION_COUNT; ++iteration) {
    26            
    27            int counter = random.nextInt(1000 - 1);         
    28            myHashMap.put(counter, counter);            
    29         }
    30         
    31         exitThreadCount.incrementAndGet();
    32      }
    33   }
    34   
    35   private static class Reader extends Thread{
    36
    37      public void run() {
    38         
    39         Random random = new Random();
    40         
    41         for (int iteration = 0; iteration < ITERATION_COUNT; ++iteration) {
    42
    43            int counter = random.nextInt(1000 - 1);         
    44            myHashMap.get(counter);
    45         }
    46         
    47         exitThreadCount.incrementAndGet();
    48      }
    49   }      
    50
    51   public static void main (String args[]) throws Exception {
    52      
    53      initData();
    54      
    55      long start = System.currentTimeMillis();
    56      
    57      // Create 10 Writer Threads
    58      for (int counter = 0; counter < 10; ++counter) {
    59         
    60         new Writer().start();
    61      }
    62      
    63      // Create 10 Reader Threads
    64      for (int counter = 0; counter < 10; ++counter) {
    65         
    66         new Reader().start();
    67      }
    68      
    69      // Wait for all threads to complete
    70      while (exitThreadCount.get() < 20) {
    71         Thread.sleep(100);
    72      }
    73      
    74      System.out.println("Total execution Time(ms): " + (System.currentTimeMillis() - start) );      
    75   }   
    76 }

    This program launches multiple threads that concurrently read from and write to a ‘java.util.HashMap’. (The listing assumes imports of ‘java.util.HashMap’, ‘java.util.Random’ and ‘java.util.concurrent.atomic.AtomicInteger’.)

    Let’s walk through this code. The primary object in this program is ‘myHashMap’, which is defined in line #7. This object is of type ‘java.util.HashMap’, and it is initialized with 1,000 records in the method ‘initData()’, defined in line #9. Both the key and the value of each entry hold the same integer. Thus the HashMap will look as shown in the diagram below:

    Key   Value
    0     0
    1     1
    2     2
    3     3
    :     :
    :     :
    999   999

    Fig: Data in the HashMap

    The ‘Writer’ thread is defined in line #19. It generates a random number between 0 and 998 (via ‘random.nextInt(1000 - 1)’) and inserts it into the HashMap, 10 million times. The numbers are generated randomly so that records are inserted into different parts of the HashMap data structure. Similarly, the ‘Reader’ thread defined in line #35 generates a random number in the same range and reads that key from the HashMap.

    The ‘main()’ method, defined in line #51, creates and launches 10 ‘Writer’ threads and 10 ‘Reader’ threads. Then, in line #70, a loop prevents the program from terminating until all the Reader and Writer threads complete their work.

    HashMap Performance
    We executed the above program several times. The average execution time was 3.16 seconds.

    Hashtable Performance
    In order to study the Hashtable performance, we replaced line #7 with ‘java.util.Hashtable’ and modified the ‘Reader’ and ‘Writer’ threads to read from and write to the ‘Hashtable’. We then executed the program several times. The average execution time was 56.27 seconds.

    ConcurrentHashMap Performance
    In order to study the ConcurrentHashMap performance, we replaced line #7 with ‘java.util.concurrent.ConcurrentHashMap’ and modified the ‘Reader’ and ‘Writer’ threads to read from and write to the ‘ConcurrentHashMap’. We then executed the program several times. The average execution time was 4.26 seconds.
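    For reference, the only structural change between the three runs is the declaration in line #7 (plus the corresponding import). The sketch below illustrates the three declarations side by side; the class and variable names here are illustrative, not taken from the original program. Since all three types implement the same ‘Map’ interface, the Reader/Writer code itself needs no change:

```java
import java.util.HashMap;
import java.util.Hashtable;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class MapVariants {
    public static void main(String[] args) {
        // Run 1: fastest, but not thread-safe
        Map<Integer, Integer> hashMap = new HashMap<>(1000);
        // Run 2: thread-safe via synchronized methods, slowest
        Map<Integer, Integer> hashtable = new Hashtable<>(1000);
        // Run 3: thread-safe with fine-grained internal locking
        Map<Integer, Integer> concurrentMap = new ConcurrentHashMap<>(1000);

        // Identical Map API across all three implementations
        for (Map<Integer, Integer> map : List.of(hashMap, hashtable, concurrentMap)) {
            map.put(7, 7);
            System.out.println(map.get(7));
        }
    }
}
```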

    HashMap, Hashtable, ConcurrentHashMap performance comparison
    The table below summarizes the execution time of each data structure:

    Data structure      Execution Time (secs)
    HashMap             3.16
    ConcurrentHashMap   4.26
    Hashtable           56.27

    Notice that HashMap has the best performance; however, it is not thread-safe. Concurrent modification can corrupt its internal structure and even send threads into an infinite loop, which would ultimately cause the application’s CPU to spike.

    ConcurrentHashMap performs slightly slower than HashMap, but it is a fully thread-safe implementation.

    Hashtable is also a thread-safe implementation, but it is roughly 18 times slower than HashMap in this test scenario.

    Why is Hashtable so slow?
    Hashtable is so slow because both the ‘get()’ and ‘put()’ methods on this object are synchronized (if you are interested, you can see the Hashtable source code here). When methods are synchronized, only one thread at a time is allowed to execute any of them on a given object.
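    The effect of method-level synchronization can be sketched with a minimal Hashtable-style wrapper (a hypothetical ‘SyncMap’, not the actual Hashtable source): a single monitor guards every call, so even two readers cannot proceed in parallel.

```java
import java.util.HashMap;

// Hypothetical sketch of Hashtable-style locking: one monitor guards all access.
class SyncMap<K, V> {
    private final HashMap<K, V> map = new HashMap<>();

    // 'synchronized' admits only one thread at a time, whether reading or writing
    public synchronized V get(K key) { return map.get(key); }
    public synchronized V put(K key, V value) { return map.put(key, value); }
}

public class SyncMapDemo {
    public static void main(String[] args) {
        SyncMap<Integer, Integer> map = new SyncMap<>();
        map.put(1, 42);
        System.out.println(map.get(1)); // prints 42
    }
}
```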

    In our sample program there are 20 threads: 10 threads invoking the ‘get()’ method and another 10 invoking the ‘put()’ method. While one of these 20 threads is executing, the remaining 19 will be in the BLOCKED state. Only after the executing thread exits ‘get()’ or ‘put()’ can the remaining threads make progress. Thus, there is significant degradation in performance.
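    The BLOCKED state described here can also be observed programmatically. Below is a small self-contained sketch (not part of the original benchmark) that uses ‘Thread.getState()’ to show a thread stuck waiting on a contended monitor:

```java
public class BlockedDemo {
    private static final Object lock = new Object();

    public static void main(String[] args) throws InterruptedException {
        // Thread A acquires the monitor and holds it for a while
        Thread holder = new Thread(() -> {
            synchronized (lock) {
                try { Thread.sleep(500); } catch (InterruptedException ignored) {}
            }
        });
        // Thread B tries to enter the same monitor and blocks
        Thread waiter = new Thread(() -> {
            synchronized (lock) { /* enters only after holder releases */ }
        });
        holder.start();
        Thread.sleep(100);  // let holder acquire the lock first
        waiter.start();
        Thread.sleep(100);  // let waiter hit the contended monitor
        System.out.println(waiter.getState()); // prints BLOCKED
        holder.join();
        waiter.join();
    }
}
```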

    To confirm this behavior, we executed the above program, captured a thread dump and analyzed it with fastThread (a thread dump analysis tool). The tool generated an analysis report; below is an excerpt showing the transitive dependency graph of BLOCKED threads.



    Fig: Threads BLOCKED on Hashtable (generated by fastThread)

    The report showed that 19 threads were in the BLOCKED state while one thread (i.e., ‘Thread-15’) was executing the ‘get()’ method on the Hashtable. Only after ‘Thread-15’ exits the ‘get()’ method can the other threads progress and execute their ‘get()’ and ‘put()’ calls. This causes a considerable slowdown in application performance.

    Conclusion
    Thus, if you need a map data structure, consider using ConcurrentHashMap, which provides performance characteristics similar to HashMap’s while at the same time providing thread-safe behavior like Hashtable.
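    As a closing illustration (assuming Java 8+), ConcurrentHashMap also offers atomic compound operations that plain HashMap lacks, so concurrent updates need no external locking. A minimal sketch, with a hypothetical counter key:

```java
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentCounter {
    public static void main(String[] args) throws InterruptedException {
        ConcurrentHashMap<String, Integer> counts = new ConcurrentHashMap<>();

        // 10 threads each increment the same key 1,000 times
        Thread[] threads = new Thread[10];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> {
                for (int j = 0; j < 1000; j++) {
                    counts.merge("hits", 1, Integer::sum); // atomic read-modify-write
                }
            });
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        System.out.println(counts.get("hits")); // prints 10000
    }
}
```

    With a plain HashMap, the same unsynchronized read-modify-write would lose updates (or worse); on ConcurrentHashMap, ‘merge()’ is guaranteed atomic.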

    Video
    https://youtu.be/2qSIZuNiIxM

  2. #2
    Super Moderator Norm's Avatar
    Join Date
    May 2010
    Location
    On vacation
    Posts
    24,756
    Thanks
    57
    Thanked 2,684 Times in 2,634 Posts

    Default Re: Java Hashtable, HashMap, ConcurrentHashMap – Performance impact

    this sample program:
    Can you post code without the leading line numbers so that it can be copied and pasted into a java file for testing?

    Also, could you add the comments from the text to the code, so that someone reading the code can understand what it is doing and does not have to come back to this thread to read the documentation?
    If you don't understand my answer, don't ignore it, ask a question.
