
Dark Theme Is Now Available in Toolbox App 1.18.

It’s been a while since you made this wish, and now we’ve finally made it come true! We are happy to introduce a frequently requested feature – the Dark Theme. 🎉

Don’t have the Toolbox App yet? Click the link below to download the free Toolbox App and start working with the theme you like the most.

Download now

You shouldn’t wait any longer to see it in action. Just update your Toolbox App to version 1.18 (if you haven’t set the Toolbox App to update automatically) and select the Dark Theme in the “Appearance & Behavior” section of the Toolbox App Settings.

Dark Theme

Currently, the app offers two options – Light or Dark Theme – which you can change manually. Go to the Toolbox App Settings and choose the theme you like under “Appearance & Behavior”.
Theme Settings

Bug Fixes 🛠

In the same release, we’ve fixed the following issues:

TBX-4898 – Generation of shell scripts on macOS for Android Studio 4.0 and 4.1 now works correctly.
TBX-4985 – Toolbox now correctly updates taskbar shortcuts on Windows.
TBX-5031, TBX-5066 – The uninstall process on Windows now works correctly.
TBX-5199 – Toolbox now updates Linux .desktop files if there is a broken symlink.
TBX-5233 – We’ve fixed a bug that caused Android Studio 4.1 not to start from the Toolbox App on macOS.

See the full list of fixed issues here.

As always, the Toolbox App team is happy to get your feedback! Leave us a message in our issue tracker or on Twitter by mentioning @JBToolbox.

Stay safe, and stay productive!
The Toolbox App team


Introducing Kotlin for Apache Spark Preview

Apache Spark is an open-source unified analytics engine for large-scale distributed data processing. Over the last few years, it has become one of the most popular tools used for processing large amounts of data. It covers a wide range of tasks – from data batch processing and simple ETL (Extract/Transform/Load) to streaming and machine learning.

Due to Kotlin’s interoperability with Java, Kotlin developers can already work with Apache Spark via the Java API. This way, however, they cannot use Kotlin to its full potential, and the general experience is far from smooth.

Today, we are happy to share the first preview of the Kotlin API for Apache Spark. This project adds a missing layer of compatibility between Kotlin and Apache Spark. It allows you to write idiomatic Kotlin code using familiar language features such as data classes and lambda expressions.

Kotlin for Apache Spark also extends the existing APIs with a few nice features.

withSpark and withCached functions


withSpark is a simple and elegant way to work with SparkSession that will automatically take care of calling spark.stop() at the end of the block for you.
You can pass parameters to it that may be required to run Spark, such as master location, log level, or app name. It also comes with a convenient set of defaults for running Spark locally.

Here’s a classic example of counting occurrences of letters in lines:

val logFile = "a/path/to/logFile.txt"
withSpark(master = "yarn", logLevel = SparkLogLevel.DEBUG) {
    spark.read().textFile(logFile).withCached {
        val numAs = filter { it.contains("a") }.count()
        val numBs = filter { it.contains("b") }.count()
        println("Lines with a: $numAs, lines with b: $numBs")
    }
}

Another useful function in the example above is withCached. In other APIs, if you want to fork computations into several paths but compute things only once, you would call the ‘cache’ method. However, this quickly becomes difficult to track, and you have to remember to unpersist the cached data. Otherwise, you risk taking up more memory than intended or even breaking things altogether. withCached takes care of tracking and unpersisting for you.
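For contrast, here is a minimal sketch of the manual bookkeeping this replaces, using the Kotlin API’s lambda-friendly filter and the logFile path from the counting example (the variable names are illustrative):

```kotlin
// Manual caching (sketch): you must call cache() yourself and
// remember to unpersist() the data afterwards.
val lines = spark.read().textFile(logFile).cache()
try {
    val numAs = lines.filter { it.contains("a") }.count()
    val numBs = lines.filter { it.contains("b") }.count()
    println("Lines with a: $numAs, lines with b: $numBs")
} finally {
    lines.unpersist() // easy to forget – and wasted memory if you do
}
```

withCached performs the equivalent tracking and cleanup for you at the end of the block.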

Null safety

Kotlin for Spark adds leftJoin, rightJoin, and other aliases to the existing methods; unlike the originals, however, these are null safe by design.

fun main() {

    data class Coordinate(val lon: Double, val lat: Double)
    data class City(val name: String, val coordinate: Coordinate)
    data class CityPopulation(val city: String, val population: Long)

    withSpark(appName = "Find biggest cities to visit") {
        val citiesWithCoordinates = dsOf(
                City("Moscow", Coordinate(37.6155600, 55.7522200)),
                // ...
        )
        val populations = dsOf(
                CityPopulation("Moscow", 11_503_501L),
                // ...
        )
        citiesWithCoordinates.rightJoin(populations, citiesWithCoordinates.col("name") `==` populations.col("city"))
                .filter { (_, citiesPopulation) ->
                    citiesPopulation.population > 15_000_000L
                }
                .map { (city, _) ->
                    // A city may potentially be null in this right join!!!
                    city?.name
                }
                .filterNotNull()
                .show()
    }
}

Note the commented line in the example above: a city may potentially be null in this right join. This would’ve caused a NullPointerException in other JVM Spark APIs, and it would’ve been rather difficult to debug the source of the problem.
Kotlin for Apache Spark takes care of null safety for you, so you can conveniently filter out null results.

What’s supported

This initial version of Kotlin for Apache Spark supports Apache Spark 3.0 with the core compiled against Scala 2.12.

The API covers all the methods needed for creating self-contained Spark applications best suited for batch ETL.

Getting started with Kotlin for Apache Spark

To help you quickly get started with Kotlin for Apache Spark, we have prepared a Quick Start Guide that will help you set up the environment, correctly define dependencies for your project, and run your first self-contained Spark application written in Kotlin.

What’s next

We understand that it takes a while to upgrade any existing framework to a newer version, and Spark is no exception. That is why in the next update we are going to add support for the earlier Spark versions: 2.4.2 – 2.4.6.

We are also working on the Kotlin Spark shell so that you can enjoy working with your data in an interactive manner, and perform exploratory data analysis with it.

Currently, Spark Streaming and Spark MLlib are not covered by this API, but we will be closely listening to your feedback and will address it in our roadmap accordingly.

In the future, we hope to see Kotlin join the official Apache Spark project as a first-class citizen. We believe that it can add value both for Kotlin, and for the Spark community. That is why we have opened a Spark Project Improvement Proposal: Kotlin support for Apache Spark. We encourage you to voice your opinions and join the discussion.

Go ahead and try Kotlin for Apache Spark and let us know what you think!


Dokka Preview Based on Kotlin 1.4.0-RC

The following post is written by Paweł Marks and Kamil Doległo.

Every programming ecosystem needs documentation to thrive. Kotlin has its roots in the JVM ecosystem, where Javadoc is a standard and universally accepted documentation engine. It was only natural to expect Kotlin to have a similarly seamless tool. That was the initial goal of Dokka – to provide a reliable and simple documentation engine. But the increasing diversity of Kotlin, with features like multiplatform projects, Native support, and so on, requires Dokka to be more complex.
The ongoing development of Kotlin 1.4 gave us a chance to rethink, redesign, and reimplement Dokka from scratch (its version number is now aligned with the Kotlin embedded compiler). In this post, we give you an overview of Dokka’s new features and announce its preview release. We would appreciate it if you could try the preview and share your feedback.


How to Try

Dokka is distributed as a plugin for two of the most popular Kotlin build tools – Maven and Gradle. For advanced use cases, Dokka can be used as a standalone command-line application. For a Gradle-based project, just add the following to your project’s build.gradle or build.gradle.kts files:

plugins {
    id("org.jetbrains.dokka") version "1.4.0-rc"
}

repositories {
    jcenter() // or another repository that hosts the Dokka artifacts
}
After running

./gradlew dokkaHtml

you should see the generated documentation in the dokka directory inside your project’s build directory.

New HTML format

One of our main goals was to produce good-looking, modern documentation without any need for tweaking and configuration from the user. That is why the default look of Dokka is no longer a simple static web page. Let’s take a quick look at the most important features of the new HTML format. Note that we’ve used the coroutines documentation to demonstrate the new Dokka format, but we won’t actually be converting this documentation until a later time.

Navigation tree

On the left side of your screen, you can see all your project modules, packages, and types organized in a hierarchical menu. This allows you to not only see the project structure at a glance, but also to quickly navigate between different parts of your codebase.

Search bar

If your project is particularly complex, or you don’t know its codebase structure and you need to find some type or function, you can use the built-in search bar. It knows all symbols in the project so it can offer IDE-like autocompletion for search queries.

Linking to sources

Dokka can link from any symbol to its definition in code if your project’s code is hosted online. Just configure your repository URL using the sourceLink option as a generation parameter, and all your functions and types will get links pointing to the exact line of code where each of them is defined.
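As a rough sketch, configuring a source link in build.gradle.kts could look like the following; this DSL matches later Dokka 1.4.x releases (it varied between preview builds), and the repository URL is a placeholder:

```kotlin
tasks.withType<org.jetbrains.dokka.gradle.DokkaTask>().configureEach {
    dokkaSourceSets {
        configureEach {
            sourceLink {
                // Local sources and the matching location in the hosted repository
                localDirectory.set(file("src/main/kotlin"))
                remoteUrl.set(java.net.URL(
                    "https://github.com/your-org/your-repo/blob/master/src/main/kotlin"))
                remoteLineSuffix.set("#L") // makes links point to exact lines
            }
        }
    }
}
```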

Information about platforms

Kotlin is a multiplatform language and its documentation engine needs to reflect all platform information that can be important for end-users. In the documentation for a multiplatform project, you will notice that every symbol is marked with a badge indicating to which platform it applies. Moreover, if there are any differences between documentation or signatures of the same symbol on different platforms you can use the tabs to switch between them.

On every page you can choose to show or hide symbols defined in specific platforms.

Runnable samples

Kotlin has a great tool called Kotlin Playground that allows you to run simple code snippets from your browser. This tool is now also integrated with Dokka. You can specify the source of your samples and they will be included in your documentation, allowing end users to check how to properly use your library. All you have to do is create an ordinary Kotlin file with your code, include it in the documentation using

samples = listOf(<paths to the files>)

and link to the desired method using

@sample <fully qualified name of your method>

and Dokka will automatically copy the source code and use it to create a runnable block.
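As a hedged sketch (the file layout and names below are illustrative, not from the post), a sample file is just an ordinary Kotlin file:

```kotlin
// samples/Samples.kt – an ordinary Kotlin file holding the sample code
package samples

fun sumSample() {
    val numbers = listOf(1, 2, 3)
    println(numbers.sum())
}
```

The documented declaration in the library then carries `@sample samples.sumSample` in its KDoc, and Dokka copies the function body into a runnable Playground block on the generated page.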

Other formats


If you want to host your documentation on GitHub Pages or other sites that use Markdown formatting, Dokka has that covered. Out of the box, it supports two different flavors of Markdown – Jekyll and GFM (GitHub Flavored Markdown). To use them, you only need to run the corresponding task:

./gradlew dokkaGfm

or

./gradlew dokkaJekyll
Kotlin-as-java and javadoc

Kotlin is interoperable with Java, which means that Kotlin libraries that target the JVM can be used from Java projects. Dokka can generate documentation for them too – you just need to apply the kotlin-as-java plugin. You will see your classes and functions in a Java-like format; for example, class properties will be changed to the appropriate get and set methods. Thanks to our new architecture, the Kotlin-as-java plugin works with HTML, GFM, and Jekyll Markdown, as well as some user-defined formats.

You can go one step further and not only desugar your properties to getters and setters but also generate documentation in the same way as Javadoc does. You just need to run the dokkaJavadoc task. The new Javadoc generation is completely independent of JDK artifacts, so it is guaranteed to work with every modern version of the JDK (8 and newer).

Multimodule projects

By default, Dokka generates one set of documentation per Kotlin module.

  • Run the corresponding Collector task (for example, dokkaHtmlCollector) to collect all definitions for all modules and document them as if they were a single module. This way the code can be divided into small modules while still having one documentation set.

  • Run the corresponding MultiModule task (for example, dokkaHtmlMultiModule) to generate documentation for each module in a separate directory and then create a common page with an index that links to all the modules, with brief previews of their documentation. You can customize this page by providing your own template in a Markdown file.

Other features

Even though Dokka 1.4 is completely rewritten, we wanted to preserve all the useful features and configuration options from previous releases. You can still:

  • Generate documentation for mixed Java and Kotlin sources.
  • Include Markdown documentation for the project, module, and package pages.
  • Generate documentation for non-public symbols.
  • Receive a report after generation about each undocumented symbol.
  • Specify all generation options on a per-package basis.


The new Dokka has a flexible and powerful plugin system. Jekyll, GFM, kotlin-as-java, and Javadoc are only a few examples of plugins that can be created for the new Dokka. Everyone can now provide a custom format or transform the documentation in any way imaginable. Thanks to its robust framework it is intuitive to use and hard to accidentally break something. If you are interested in plugin development, check out our developers’ guide.


Kotlin 1.4.0-RC: Debugging coroutines

We continue to highlight the upcoming changes in the 1.4 release. In this blog post, we want to describe a couple of important features related to coroutines:

  • New functionality to conveniently debug coroutines
  • The ability to define deep recursive functions

These changes are already available for you to try in the 1.4.0-RC release!

Let’s dive into details.

Debugging coroutines

Coroutines are great for asynchronous programming (but not only for that), and many people already use them or are starting to use them. When you write code with coroutines, however, trying to debug them can be a real pain. Coroutines jump between threads. It can be difficult to understand what a specific coroutine is doing or to check its context. And in some cases, tracking steps over breakpoints simply doesn’t work. As a result, you have to rely on logging or mental effort to debug the code with coroutines. To address this issue, we’re introducing new functionality in the Kotlin plugin that aims to make debugging coroutines much more convenient.

The Debug Tool Window now contains a new Coroutines tab. It is visible by default, and you can switch it on and off:

In this tab, you can find information about both currently running and suspended coroutines. The coroutines are grouped by the dispatcher they are running on. If you started a coroutine with a custom name, you can find it by this name in the Tool Window. In the following example, you can see that the main coroutine is running (we’ve stopped on a breakpoint inside it), and the other four coroutines are suspended:

import kotlinx.coroutines.*

fun main() = runBlocking {
    repeat(4) {
        launch(Dispatchers.Default + CoroutineName("Default-${'a' + it}")) {
            val name = coroutineContext[CoroutineName.Key]?.name
            println("I'm '$name' coroutine")
            delay(1000) // suspension point
        }
    }
    // breakpoint
    println("I'm the main coroutine")
}

With the new functionality, you can check the state of each coroutine and see the values of local and captured variables. This also works for suspended coroutines!

In this example, we check the values of the local variables of suspended coroutines:

import kotlinx.coroutines.*

fun main() = runBlocking<Unit> {
    launch {
        val a = 3
        delay(1000) // suspension point
    }
    launch {
        val b = 2
        delay(1000) // suspension point
    }
    launch {
        val c = 1
        // breakpoint here:
        delay(1000)
    }
}

Choose a suspended coroutine (click it to see its state at that point) and the Variables tab will show you the state of the local variables:

You can now see a full coroutine creation stack, as well as a call stack inside the coroutine:

Use the ‘Get Coroutines Dump’ option to get a full report containing the state of each coroutine and its stack:

At the moment, the coroutines dump is still rather simple, but we’re going to make it more readable and helpful in future versions.

Note that to make the debugger stop at a given breakpoint inside a coroutine, this breakpoint should have the “Suspend: All” option chosen for it:

To try this new functionality for debugging coroutines, you need to use the latest version of kotlinx.coroutines, 1.3.8-1.4.0-rc, and the latest version of the Kotlin plugin (e.g. 1.4.0-rc-release-IJ2020.1-2).

The functionality is available only for Kotlin/JVM. If you encounter any problems (please don’t forget to share the details with us!), you can switch it off by opening Build, Execution, Deployment | Debugger | Data Views | Kotlin in Preferences and choosing Disable coroutines agent. For now, we’re releasing this functionality for debugging coroutines in the experimental state, and we’re looking forward to your feedback!

Defining deep recursive functions using coroutines

In Kotlin 1.4, you can define recursive functions and invoke them even when the call depth is greater than 100,000, using the standard library support based on coroutines!

Let’s first look at an ordinary recursive function, whose usage results in a StackOverflowError when the recursion depth gets too high. After that, we’ll discuss how you can fix the problem and rewrite the function using the Kotlin standard library.

We’ll use a simple binary tree, where each Tree node has a reference to its left and right children:
class Tree(val left: Tree?, val right: Tree?)

The depth of the tree is the length of the longest path from its root to its child nodes. It can be computed using the following recursive function:

fun depth(t: Tree?): Int =
   if (t == null) 0 else maxOf(
       depth(t.left),
       depth(t.right)
   ) + 1

The tree depth is the maximum of the depths of the left and right children, increased by one. When the tree is empty, the depth is zero.

This function works fine when the recursion depth is small:

class Tree(val left: Tree?, val right: Tree?)

fun depth(t: Tree?): Int =
   if (t == null) 0 else maxOf(
       depth(t.left),
       depth(t.right)
   ) + 1

fun main() {
    val tree = Tree(Tree(Tree(null, null), null), null)
    println(depth(tree)) // 3
}

However, if you create a tree with a depth greater than 100,000, which in practice is not so uncommon, you’ll get a StackOverflowError as a result:

class Tree(val left: Tree?, val right: Tree?)

fun depth(t: Tree?): Int =
   if (t == null) 0 else maxOf(
       depth(t.left),
       depth(t.right)
   ) + 1

fun main() {
   val n = 100_000
   val deepTree = generateSequence(Tree(null, null)) { prev ->
       Tree(prev, null)
   }.take(n).last()
   println(depth(deepTree))
}
Exception in thread "main" java.lang.StackOverflowError
  at FileKt.depth(File.kt:5)

The problem is that the call stack gets too large. To solve this issue, you can use a VM option to increase the maximum stack size. However, while this might work for specific use cases, it’s not a practical solution for the general case.

Alternatively, you can rewrite the code and store results for intermediate calls by hand in the heap rather than on the stack. This solution works in most cases and is common in other languages. However, the resulting code becomes non-trivial and complicated, and the beauty and simplicity of the initial function are lost. You can find an example here.
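To illustrate the idea (this is a sketch of the approach, not the linked example), a depth function can keep its pending work in an explicit stack allocated on the heap:

```kotlin
class Tree(val left: Tree?, val right: Tree?)

// Computes the same depth as the recursive version, but stores the
// nodes still to visit in a heap-allocated stack instead of on the
// call stack, so deep trees no longer overflow it.
fun depthIterative(root: Tree?): Int {
    var max = 0
    val stack = mutableListOf(root to 1)
    while (stack.isNotEmpty()) {
        val (node, depth) = stack.removeAt(stack.lastIndex)
        if (node == null) continue
        if (depth > max) max = depth
        stack.add(node.left to depth + 1)
        stack.add(node.right to depth + 1)
    }
    return max
}
```

The logic still matches the original definition (an empty tree has depth 0), but the simple recursive shape is gone – which is exactly the trade-off described above.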

Kotlin now provides a clean way to solve this problem based on the coroutines machinery.

The Kotlin standard library now includes the DeepRecursiveFunction definition, which models recursive calls using the suspension mechanism:

class Tree(val left: Tree?, val right: Tree?)

val depthFunction = DeepRecursiveFunction<Tree?, Int> { t ->
   if (t == null) 0 else maxOf(
       callRecursive(t.left),
       callRecursive(t.right)
   ) + 1
}

fun depth(t: Tree?) = depthFunction(t)

fun main() {
   val n = 100_000
   val deepTree = generateSequence(Tree(null, null)) { prev ->
       Tree(prev, null)
   }.take(n).last()

   println(depth(deepTree)) // 100000
}

You can compare the two versions, the initial one and the one using DeepRecursiveFunction, to make sure that the logic remains the same. Your new function now becomes a variable of type DeepRecursiveFunction, which you can call using the ‘invoke’ convention as depthFunction(t). The function body becomes the body of the lambda argument of DeepRecursiveFunction, and the recursive call is replaced with callRecursive. These changes are straightforward and easy to make. Note that while the new depth function uses coroutines under the hood, it is not itself a suspending function.

Understanding how DeepRecursiveFunction is implemented is interesting, but it is not necessary in order for you to use it and benefit from it. You can find the implementation details described in this blog post.

DeepRecursiveFunction is a part of the Kotlin standard library, not part of the kotlinx.coroutines library, since it’s not about asynchronous programming. At the moment this API is still experimental, so we’re looking forward to your feedback!

How to try it

As always, you can try Kotlin online at play.kotlinlang.org.

In IntelliJ IDEA and Android Studio, you can update the Kotlin Plugin to version 1.4.0-RC. See how to do this.

If you want to work on existing projects that were created before installing the preview version, you need to configure your build for the preview version in Gradle or Maven. Note that unlike the previous preview versions, Kotlin 1.4.0-RC is also available directly from Maven Central. This means you won’t have to manually add an extra preview repository to your build files.

You can download the command-line compiler from the GitHub release page.

Share your feedback

We’ll be very thankful if you find and report bugs to our issue tracker. We’ll try to fix all the important issues before the final release, which means you won’t need to wait until the next Kotlin release for your issues to be addressed.

You are also welcome to join the #eap channel in Kotlin Slack (get an invite here). In this channel, you can ask questions, participate in discussions, and get notifications about new preview builds.

Let’s Kotlin!


Kotlin 1.4.0-RC Released

We’re almost there! We’re happy to unveil Kotlin 1.4.0-RC – the release candidate for the next major version of our programming language. Read on to learn about what has changed in Kotlin 1.4.0-RC, and make sure to try its new features before they are officially released with Kotlin 1.4.0.

A special thanks to everyone who tried our milestone releases (1.4-M1, 1.4-M2, and 1.4-M3), shared their feedback, and helped us improve this version of Kotlin!

This post highlights the new features and key improvements that are available in Kotlin 1.4.0-RC:

Improved *.gradle.kts IDE support

We significantly improved the IDE support for Gradle Kotlin DSL scripts (*.gradle.kts files) in Kotlin 1.3.70, and we’ve continued to improve it for Kotlin 1.4.0-RC. Here is what this new version brings:

Loading script configuration explicitly for better performance

Previously, when you added a new plugin to the plugins block of your build.gradle.kts, the new script configuration was loaded automatically in the background. Then, after it was applied, you could use code assistance for the newly added plugin.

To improve performance, we’ve removed this automatic behavior of applying changes to the script configuration upon typing. For Gradle 6.0 and above, you need to explicitly apply changes to the configurations by clicking Load Gradle Changes or by reimporting the Gradle project.

In earlier versions of Gradle, you need to manually load the script configuration by clicking Load Configuration in the editor.

We’ve added one more action in IntelliJ IDEA 2020.1 with Gradle 6.0+, Load Script Configurations, which loads changes to the script configurations without updating the whole project. This takes much less time than reimporting the whole project.

Better error reporting

Previously you could only see errors from the Gradle Daemon (a process that runs in the background and is responsible for all Gradle-related tasks and activities) in separate log files. Now if you use Gradle 6.0 or above, the Gradle Daemon returns all the information about errors directly and shows it in the Build tool window. This saves you both time and effort.

Less boilerplate in your project’s configuration

With improvements to the Kotlin Gradle plugin, you can write less code in your Gradle build files: one of the most common scenarios is now enabled by default.

Making the standard library a default dependency

An overwhelming majority of projects require the Kotlin standard library. Starting from 1.4.0-RC, you no longer need to declare a dependency on the standard library in each source set manually – it will now be added by default. The automatically added version of the standard library will be the same as the version of the Kotlin Gradle plugin, since they use the same versioning.

This is how a typical multiplatform project configuration with Android, iOS, and JavaScript targets looked before 1.4:



sourceSets {
    commonMain {
        dependencies {
            implementation("org.jetbrains.kotlin:kotlin-stdlib-common")
        }
    }
    androidMain {
        dependencies {
            implementation("org.jetbrains.kotlin:kotlin-stdlib")
        }
    }
    jsMain {
        dependencies {
            implementation("org.jetbrains.kotlin:kotlin-stdlib-js")
        }
    }
    iosMain {
        dependencies {
            // other dependencies
        }
    }
}

Now, you don’t need to explicitly declare a dependency on the standard library at all, and with hierarchical project structure support, announced in 1.4-M2, you have to specify other dependencies only once. So your Gradle build file will become much more concise and easy to read:



sourceSets {
    commonMain {
        dependencies {
            // your other dependencies – no standard library required
        }
    }
}

For platform source sets and backend-shared source sets, the corresponding standard library will be added, while a common standard library will be added to the rest. The Kotlin Gradle plugin will select the appropriate JVM standard library depending on the kotlinOptions.jvmTarget compiler option of your Gradle build script.
If you declare a standard library dependency explicitly (for example, if you need a different version), the Kotlin Gradle plugin won’t override it or add a second standard library. And if you do not need a standard library at all, you can add the opt-out flag to the Gradle properties:

kotlin.stdlib.default.dependency=false

Simplified management of CocoaPods dependencies

Previously, once you integrated your project with the dependency manager CocoaPods, you could build an iOS, macOS, watchOS, or tvOS part of your project only in Xcode, separate from other parts of your multiplatform project. These other parts could be built in IntelliJ IDEA.

Moreover, every time you added a dependency on an Objective-C library stored in CocoaPods (a Pod library), you had to switch from IntelliJ IDEA to Xcode, run pod install, and run the Xcode build there.

Now you can manage Pod dependencies right in IntelliJ IDEA while enjoying the benefits it provides for working with code, such as code highlighting and completion. You can also build the whole Kotlin project with Gradle, without having to switch to Xcode. This means you only have to go to Xcode when you need to write Swift/Objective-C code or run your application on a simulator or device.

Now you can also work with Pod libraries stored locally.

Depending on your needs, you can add dependencies between:

  • A Kotlin project and Pod libraries from the CocoaPods repository.
  • A Kotlin project and Pod libraries stored locally.
  • A Kotlin Pod (Kotlin project used as a CocoaPods dependency) and an Xcode project with one or more targets.

Complete the initial configuration, and when you add a new dependency to CocoaPods, just re-import the project in IntelliJ IDEA. The new dependency will be added automatically. No additional steps are required.

Below you can find instructions on how to add dependencies on Pod libraries from the CocoaPods repository. The Kotlin 1.4 documentation will cover all scenarios.

How to use the CocoaPods integration

Install the CocoaPods dependency manager and plugin
  1. Install the cocoapods dependency manager (sudo gem install cocoapods).
  2. Install the cocoapods-generate plugin (sudo gem install cocoapods-generate).
  3. In the build.gradle.kts file of your project, apply the CocoaPods plugin along with the Kotlin Multiplatform plugin:

    plugins {
       kotlin("multiplatform") version "1.4.0-rc"
       kotlin("native.cocoapods") version "1.4.0-rc"
    }
Add dependencies on Pod libraries from the CocoaPods repository
  1. Add dependencies on the Pod libraries that you want to use from the CocoaPods repository with pod().
    You can also add dependencies as subspecs.

    kotlin {
        cocoapods {
            summary = "CocoaPods test library"
            homepage = ""
            pod("AFNetworking", "~> 4.0.0")
            // Remote Pod added as a subspec
            pod("SDWebImage/MapKit")
        }
    }
  2. Re-import the project.
    To use dependencies from Kotlin code, import the packages:

    import cocoapods.AFNetworking.*
    import cocoapods.SDWebImage.*

We’re also happy to share a sample project with you that demonstrates how to add dependencies on Pod libraries stored both remotely in the CocoaPods repository and locally.

Generate release .dSYMs on Apple targets by default

Debugging an iOS application crash sometimes involves analyzing crash reports, and crash reports generally require symbolication to become properly readable. To symbolicate addresses in Kotlin, the .dSYM bundle for Kotlin code is required. Starting with 1.4-M3, the Kotlin/Native compiler produces .dSYMs for release binaries on Darwin platforms by default. This can be disabled with the -Xadd-light-debug=disable compiler flag. On other platforms, this option is disabled by default. To toggle this option in Gradle, use:

kotlin {
    targets.withType<org.jetbrains.kotlin.gradle.plugin.mpp.KotlinNativeTarget> {
        binaries.all {
            freeCompilerArgs += "-Xadd-light-debug={enable|disable}"
        }
    }
}

Performance improvements

We continue to focus on optimizing the overall performance of the Kotlin/Native development process:

  • In 1.3.70 we introduced two new features for improving the performance of Kotlin/Native compilation: caching project dependencies and running the compiler from the Gradle daemon. Thanks to your feedback, we’ve managed to fix numerous issues and improve the overall stability of these features, and we will continue to do so.
  • There are also some runtime performance improvements. Overall runtime performance has improved because of optimizations in the GC. This improvement will be especially apparent in projects with a large number of long-lived objects.



  • HashMap and HashSet collections now work faster by escaping redundant boxing.


Kotlin/JS

With Kotlin 1.4.0-RC, we are making the @JsExport annotation compatible with the default compiler backend. We are also providing more robust and fine-grained control over npm dependency management and the Dukat integration for Gradle projects, refining our support for CSS, and offering a first look at our integration with the Node.js APIs, among other things.


@JsExport annotation for default compiler backend

In the previous milestones for Kotlin 1.4, we introduced the @JsExport annotation, which is used to make a top-level declaration available from JavaScript or TypeScript when using the new IR compiler backend. Starting with Kotlin 1.4-M3, it is now also possible to use this annotation with the current default compiler backend. Annotating a top-level declaration with @JsExport when using the current default compiler backend turns off name mangling for the declaration. Having this annotation in both compiler backends allows you to transition between them without having to adjust your logic for exporting top-level declarations. Please note that the generation of TypeScript definitions is still only available when using the new IR compiler backend.
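For illustration, a minimal export could look like this (the function itself is a made-up example):

```kotlin
// @JsExport keeps the name unmangled on the default backend,
// so JavaScript code can call it simply as greet(...).
@JsExport
fun greet(name: String): String = "Hello, $name!"
```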

Changes to npm dependency management

Explicit version requirement for dependency declarations

Declaring dependencies on npm packages without specifying a version number makes it harder to reliably manage the packages you use. This is why you are required from now on to explicitly specify a version or version range based on npm’s semver syntax for dependencies. The Gradle DSL now also supports multiple ranges for dependencies, allowing you to pinpoint exactly which versions you want to accept in your project, for example:

dependencies {
    implementation(npm("react", "> 14.0.0 <=16.9.0"))
}

Additional types of npm dependencies

Besides regular dependencies from npm, which you can specify using


inside your


block, there are now three more types of dependencies that you can use:

To learn more about when each type of dependency is best used, have a look at the official npm documentation.
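As a hedged sketch — assuming the three additional functions are `devNpm`, `optionalNpm`, and `peerNpm`, and noting that their exact names and signatures are an assumption here — a dependencies block using them might look like this:

```kotlin
dependencies {
    implementation(npm("react", "16.13.1"))          // regular dependency, bundled as usual
    implementation(devNpm("webpack", "4.44.0"))      // needed only at build time
    implementation(optionalNpm("fsevents", "2.1.3")) // install failures do not break the build
    implementation(peerNpm("react-dom"))             // expected to be provided by the consumer
}
```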

Automatic inclusion and resolution of transitive npm dependencies

Previously, if you depended on a library whose author did not manually add a


file to its artifacts, you would sometimes be required to manually import its npm dependencies. This was the case for


, for example, which required you to include




as dependencies in your Gradle build file for the package to work on Kotlin/JS.

Now the Gradle plugin automatically generates


files for libraries, and includes them in the




artifacts. When including a library of this sort, the file is automatically parsed and the required dependencies are automatically included, removing the need to add them to your Gradle build file manually.

Adjustments for CSS support

With Kotlin 1.4-M2, we introduced support for webpack’s CSS and style loaders directly from Gradle via


. In order to more closely reflect its actual tasks and effects, we have since renamed the configuration parameter to


. Going forward, the Gradle plugin no longer enables CSS support by default – a setting we had experimented with in 1.4-M2. We hope that this change will prevent confusion for those who include their own settings for how style sheets should be handled (for example by using Sass or Less loaders). In these situations, it would not be immediately obvious that a project’s default configuration already injects some CSS settings that could lead to a conflict.

To turn on CSS support in your project, set the


flag in your Gradle build file for




, and


. When creating a new project using the wizards included in IntelliJ IDEA, these settings will automatically be included in the generated



webpackTask {
   cssSupport.enabled = true
}
runTask {
   cssSupport.enabled = true
}
testTask {
   useKarma {
      // . . .
      webpackConfig.cssSupport.enabled = true
   }
}
We realize that having to adjust these settings individually for each task is not very convenient. We are looking at adding a central point of configuration for


in the plugin’s DSL (you can follow our progress here).

Improvements for Dukat integration

The Kotlin/JS Gradle plugin adds more fine-grained control to its integration with Dukat, the tool for automatically converting TypeScript declaration files (


) into Kotlin external declarations. You now have two different ways to select if and when Dukat should generate declarations:

Generating external declarations at build time

The npm dependency function now takes a third parameter after the package name and version:


. This allows you to individually control whether Dukat should generate declarations for a specific dependency, like so:

dependencies {
  implementation(npm("decamelize", "4.0.0", generateExternals = true))
}

You can use the


flag (formerly named


while it was still experimental) in your file to set the generator’s behavior for all npm dependencies simultaneously. As usual, individual explicit settings take precedence over this general flag.

Manually generating external declarations via Gradle task

If you want to have full control over the declarations generated by Dukat, if you want to apply manual adjustments, or if you’re running into trouble with the auto-generated externals, you can also trigger the creation of the declarations for all your npm dependencies manually using the


Gradle task. This will generate declarations in a directory titled


in your project root. Here, you can review the generated code and copy any parts you would like to use to your source directories. (Please be advised that manually providing external declarations in your source folder and enabling the generation of external declarations at build time for the same dependency can result in resolution issues.)

Migration preparation for kotlin.dom and kotlin.browser to separate artifacts

In order to evolve our browser and DOM bindings for Kotlin/JS faster and decouple them from the release cycle of the language itself, we are deprecating the current APIs located in the




packages. We provide replacements for these APIs in the




packages, which will be extracted to separate artifacts in a future release. Migrating to these new APIs is straightforward. Simply adjust the imports used in your project to point to these new kotlinx packages. Quick-fixes in IntelliJ IDEA, accessible via Alt-Enter, can help with this migration.
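In practice, the migration is typically just an import change, sketched here under the assumption that the replacement packages are `kotlinx.browser` and `kotlinx.dom`:

```kotlin
// Before: import kotlin.browser.document (now deprecated)
import kotlinx.browser.document

fun main() {
    // The DOM API itself is unchanged; only the package prefix moves to kotlinx.
    document.body?.appendChild(document.createTextNode("Hello, kotlinx.browser!"))
}
```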

Preview: kotlinx-nodejs

We are excited to share a preview of our official bindings for the Node.js APIs


. While it has been possible to target Node.js with Kotlin for a long time, the full potential of the target is unlocked when you have typesafe access to its API. You can check out the


bindings on GitHub.

To add


to your project, make sure


is added to your repositories. You can then simply add a dependency on the artifact:

dependencies {
    // . . .
}

After loading the Gradle changes, you can then experiment with the API provided by Node.js, for example by making use of their DNS resolution package:

fun main() {
    dns.lookup("") { err, address, family ->
        console.log("address: $address, family IPv$family")
    }
}
Especially because this is still a preview version, we encourage you to give kotlinx-nodejs a try and report any issues you encounter in the repository’s issue tracker.

Deprecation of kotlin2js and kotlin-dce-js Gradle plugins

Starting with Kotlin 1.4, the old Gradle plugins for targeting JavaScript with Kotlin (




) will be officially deprecated in favor of the


Gradle plugin.
Key functionality that was available in these plugins, alongside the


(which was already deprecated previously) has been condensed into the new plugin, allowing you to configure your Kotlin/JS target using a unified DSL that is also compatible with Kotlin/Multiplatform projects.

Since Kotlin 1.3.70, dead code elimination (DCE) has been applied automatically when using the




tasks, which run and create optimized bundles of your program. (Please note that dead code elimination is currently only available when targeting the browser for production output, not for Node.js or tests. If you have additional use cases you’d like to see addressed, feel free to share them with us on YouTrack.)

Additional quality-of-life improvements and notable fixes

  • We have added more compiler errors for prohibited usages of the

    annotation to highlight such problems.

  • When using the IR compiler backend, we have enabled a new strategy that includes incremental compilation for

    s, which is one of many steps we are taking to improve compilation time.

  • The configuration for the webpack development server has been adjusted, preventing errors like
    ENOENT: no such file or directory

    when using the hot reload functionality.

Evolving the Kotlin Standard Library API

Kotlin 1.4 is a feature release in terms of Kotlin’s evolution, so it brings a lot of new features that you already know about from previous blog posts. However, another important aspect of a feature release is that it includes significant evolutionary changes in the existing API. Here’s a brief overview of the changes you can expect with the 1.4 release.

Stabilization of the experimental API

In order to ship the new things you want to see in Kotlin libraries as fast as possible, we provide experimental versions of them. This status indicates that work on the API is still in progress and that it could be changed incompatibly in the future. When you try to use an experimental API, the compiler warns you about its status and requires an explicit opt-in.



In feature releases, experimental APIs can be promoted to stable. At this point, we guarantee that their form and behavior won’t change suddenly (changes are only possible with a proper deprecation cycle). Once an API is officially stable, you can use the API safely without warnings or opt-ins.
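As a sketch of what the opt-in looks like in code: `buildList`, for example, was still experimental around 1.4, so calling it requires acknowledging the `ExperimentalStdlibApi` marker:

```kotlin
// Opting in silences the compiler's experimental-API warning for this declaration.
@OptIn(ExperimentalStdlibApi::class)
fun firstThree(): List<Int> = buildList {
    add(1)
    add(2)
    add(3)
}
```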

With 1.4, we are promoting a number of experimental functions in the Kotlin libraries to stable. Here are some examples, along with versions in which they were introduced:

More API functions and classes are becoming stable in 1.4. Starting from this version (1.4.0-RC), using them in your project won’t require an opt-in.



Deprecation cycles

Feature releases also involve taking the next steps in existing deprecation cycles. While in incremental releases we only start new deprecation cycles with the


level, in feature releases we tighten them to


. In turn, API elements that already have the


level can be completely hidden from new uses in code and only remain in binary form to preserve compatibility for already compiled code. Together, these steps ensure the gradual removal of deprecated API elements.
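A sketch of what this looks like from a library author’s perspective (all names here are hypothetical):

```kotlin
// Feature release: the level is tightened from WARNING to ERROR, so new
// usages of oldApi() no longer compile. A later HIDDEN level would remove it
// from resolution entirely while keeping it in the binary for compiled clients.
@Deprecated("Use newApi() instead", ReplaceWith("newApi()"), DeprecationLevel.ERROR)
fun oldApi(): Int = newApi()

fun newApi(): Int = 42
```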

If your code uses API elements with a deprecation level of


, the compiler warns you about such usages. When you update to Kotlin 1.4.0-RC, some of these warnings will turn into errors. Use the IDE prompts to properly replace erroneous usages with the provided alternatives and make sure that your code compiles again.

Detailed information about breaking changes in the Kotlin Standard Library API can be found in the Compatibility Guide for Kotlin 1.4.


We skipped this section in a couple of previous blog posts, but we haven’t stopped working on Kotlin scripting to make it more stable, faster, and easier to use in 1.4. In the RC version, you can already observe better performance along with numerous fixes and functional improvements.

Artifacts renaming

In order to avoid confusion about artifact names, we’ve renamed




to just






). These artifacts depend on the


artifact, which shades the bundled third-party libraries to avoid usage conflicts. With this renaming, we’re making the usage of


(which is safer in general) the default for scripting artifacts.
If, for some reason, you need artifacts that depend on the unshaded


, use the artifact versions with the


suffix, such as


. Note that this renaming affects only the scripting artifacts that are supposed to be used directly; names of other artifacts remain unchanged.
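As a hedged illustration of choosing between the shaded (default) and unshaded variants — the artifact coordinates below are assumptions based on the Kotlin scripting artifacts published to Maven Central:

```kotlin
dependencies {
    // Default: depends on the shaded kotlin-compiler-embeddable.
    implementation("org.jetbrains.kotlin:kotlin-scripting-jsr223:1.4.0-rc")

    // Only if you specifically need the unshaded compiler:
    // implementation("org.jetbrains.kotlin:kotlin-scripting-jsr223-unshaded:1.4.0-rc")
}
```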

CLion IDE plugin is now deprecated

We’ve launched a deprecation cycle for the CLion IDE plugin. It was originally intended for debugging Kotlin/Native executables, a capability that is now available in IntelliJ IDEA Ultimate. We’ll stop publishing the CLion IDE plugin after the 1.4 release. Please contact us if this deprecation causes any problems, and we will do our best to help you solve them.


As in all major releases, some deprecation cycles of previously announced changes are coming to an end with Kotlin 1.4. All of these cases were carefully reviewed by the language committee and are listed in the Compatibility Guide for Kotlin 1.4. You can also explore these changes on YouTrack.

Release candidate notes

Now that we’ve reached the final release candidate for Kotlin 1.4, it is time for you to start compiling and publishing! Unlike previous milestone releases, binaries created with Kotlin 1.4.0-RC are guaranteed to be compatible with Kotlin 1.4.0.

How to try the latest features

As always, you can try Kotlin online at

In IntelliJ IDEA and Android Studio, you can update the Kotlin Plugin to version 1.4.0-RC. See how to do this.

If you want to work on existing projects that were created before installing the preview version, you need to configure your build for the preview version in Gradle or Maven. Note that unlike the previous preview versions, Kotlin 1.4.0-RC is also available directly from Maven Central. This means you won’t have to manually add the


repository to your build files.
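For Gradle, the setup can be sketched as follows (assuming the Maven version string is `1.4.0-rc`):

```kotlin
// build.gradle.kts
plugins {
    kotlin("jvm") version "1.4.0-rc"
}

repositories {
    mavenCentral() // the RC resolves from Maven Central; no extra repository needed
}
```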

You can download the command-line compiler from the GitHub release page.

You can use the following versions of the libraries published together with this release:

The release details and the list of compatible libraries are also available here.

Share your feedback

We’ll be very thankful if you find and report bugs to our issue tracker. We’ll try to fix all the important issues before the final release, which means you won’t need to wait until the next Kotlin release for your issues to be addressed.

You are also welcome to join the #eap channel in Kotlin Slack (get an invite here). In this channel, you can ask questions, participate in discussions, and get notifications about new preview builds.

Let’s Kotlin!

External contributions

We’d like to thank all of our external contributors whose pull requests were included in this release:

