This lookup is typically triggered when you call new String(bytes, "UTF-8") or Charset.forName("UTF-8").
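For illustration, here is a minimal sketch of the calls in question; the class and method names are mine, not from the patch. Each invocation resolves the charset by its name again instead of reusing a Charset object:

```java
import java.io.UnsupportedEncodingException;

// Hypothetical demo class, for illustration only
public class CharsetLookupDemo {
    public static String decode(byte[] bytes) throws UnsupportedEncodingException {
        // Resolves the "UTF-8" name through the charset lookup machinery on every call
        return new String(bytes, "UTF-8");
    }

    public static void main(String[] args) throws UnsupportedEncodingException {
        System.out.println(decode("hello".getBytes("UTF-8")));
    }
}
```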
Looking closer at the Charset class shows that it actually uses two levels of caches:
and if it cannot find your charset in the caches, it falls back to the standardProvider, a sun.nio.cs.StandardCharsets that extends sun.nio.cs.FastCharsetProvider, whose lookup implementation is synchronized, as you can see:
To prevent this issue, since Java 1.6 you can pass a Charset object directly in your own code. But given all the libraries you depend on, you would have a hard time patching every one of them, as mentioned in this very good post.
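A minimal sketch of that first workaround, caching the Charset once (the class and field names are mine; note that since Java 7 you can also use java.nio.charset.StandardCharsets.UTF_8 and skip the forName call entirely):

```java
import java.nio.charset.Charset;

// Hypothetical demo class, for illustration only
public class CachedCharset {
    // Looked up once at class load; reusing the object skips the synchronized name lookup
    public static final Charset UTF_8 = Charset.forName("UTF-8");

    public static void main(String[] args) {
        byte[] bytes = "hello".getBytes(UTF_8);       // String.getBytes(Charset), since Java 1.6
        System.out.println(new String(bytes, UTF_8)); // String(byte[], Charset), since Java 1.6
    }
}
```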
Or, we could just patch Java at the source, use whatever versions of the libraries and of Java we want, and apply the patch to old systems as well.
Call NonBlockingCharsetProvider.setUp(); to replace the standard Java provider with this non-blocking one, using reflection.
It provides two modes: a lazy mode that fetches the value from the parent provider when needed and puts it into a concurrent non-blocking hashmap (better than the standard ConcurrentHashMap), and a non-lazy mode that fetches all the parent's values at initialization and serves them from a thread-safe Guava ImmutableMap. Performance is pretty close in both modes; the difference is whether you want to duplicate every charset supported by the JRE into the cache, or only the ones your application actually uses.
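The lazy mode can be sketched roughly like this, using the JDK's ConcurrentHashMap as a stand-in for the non-blocking map the patch actually uses (the class and method names are mine, not the project's):

```java
import java.nio.charset.Charset;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Rough sketch of the lazy mode: cache charsets in a concurrent map and only
// fall back to the (synchronized) standard lookup on a cache miss.
// Hypothetical class, for illustration only.
public class LazyCharsetCache {
    private final ConcurrentMap<String, Charset> cache = new ConcurrentHashMap<>();

    public Charset charsetForName(String name) {
        // After the first miss, subsequent lookups for the same name are lock-free reads
        return cache.computeIfAbsent(name, Charset::forName);
    }
}
```

The non-lazy mode trades memory for simplicity: it copies every charset into an immutable map up front, so no lookup ever reaches the parent provider.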
The source code is on GitHub
The benchmark source as well