Scala version

1/29/2024

Spark 2.0 brings significant changes to the abstractions and APIs of the Spark platform. Along with a performance boost, this version makes some non-backward-compatible changes to the framework. To keep up to date with the latest updates, one needs to migrate their Spark 1.x code base to 2.x. In the last few weeks I was involved in migrating a fairly large code base, and I found it quite an involved process. In this series of posts I will be documenting my experience of the migration, so it may help all the ones out there who are planning to do the same. In this post, we will discuss how to upgrade our dependencies to add the right support for Spark 2.0.

When you want to upgrade from Spark 1.x to Spark 2.x, the first task is to pick the right Scala version. In Spark 1.x, Spark was built using Scala 2.10.6. From Spark 2.0, the default version changed to 2.11.8. The 2.10 version is still supported, even though it is no longer the default.

Scala major versions are not binary compatible, i.e. you cannot mix and match libraries built using 2.10 and 2.11. So whenever you change the Scala version of a project, you need to upgrade all the libraries of the project, including the non-Spark ones. This is significant work, as you need to comb through each and every dependency and make sure the right version exists.

Initially I started the upgrade using Scala 2.10, as it was the path of least resistance. All the other external libraries needed no change, and it was smooth. But I soon realised that the distribution on the Spark download page is built only with Scala 2.11, so to support 2.10 I would have to build my own distribution. I also came across the JIRA which discusses removing Scala 2.10 support altogether in 2.3.0. This meant that investing in 2.10 would not be good, as it will be obsolete in the next few versions. So I chose 2.11.8 as my Scala version for the upgrade.

One of the major challenges of changing the Scala version is updating all the project dependencies. My project had a fair bit of them, and luckily all of those libraries had a Scala 2.11 version. So please make sure that all the libraries have a 2.11 version before you make the decision to change the Scala version.

Choosing the Right Java Version

From Spark 2.1.0, support for Java 7 has been deprecated. So I started using Java 8 for building and deploying the code.
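To make the Scala and Java version changes concrete, here is a minimal build.sbt sketch of the kind of changes involved, assuming an sbt-based project; the dependency list is an illustrative example, not a copy of the project's actual build:

```scala
// build.sbt -- illustrative sketch; the library choices are hypothetical examples
scalaVersion := "2.11.8" // the default Scala version for Spark 2.0

// %% appends the Scala binary version (_2.11) to the artifact name, so sbt
// resolves the 2.11 build of each library -- this is why every dependency
// must publish a 2.11 artifact before you switch Scala versions.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
)

// Spark 2.1.0 deprecates Java 7, so compile for Java 8 explicitly.
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
scalacOptions += "-target:jvm-1.8"
```

The `%%` operator is the piece that does the combing for you: if any library in the list has no 2.11 artifact published, resolution fails up front instead of at runtime.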
Finding the Scala Version at Runtime

For Scala 2, use scala.util.Properties.versionNumberString (or versionString):

scala> util.Properties.versionNumberString

For Scala 3, if you do the same thing, you may be surprised by the answer:

% scala3 -version
Scala code runner version 3.0.1 - Copyright 2002-2021, LAMP/EPFL

scala> util.Properties.versionNumberString

It reports a Scala 2.13 version. That's because Scala 3.0.x uses the Scala 2 standard library as-is, to aid migration, and makes only a small number of additions. (Eventually the standard libraries will no longer remain synchronized like this.)

Here's how to get the Scala 3 compiler version:

scala> dotty.tools.dotc.config.Properties.simpleVersionString

This only works if the scala3-compiler JAR is on your classpath. (In the standard Scala 3 REPL it is; in some other environments, it might not be.) If the compiler isn't on your classpath and you want the full Scala 3 version string, see Dmitrii's answer.

If the compiler isn't on your classpath but you just want to find out at runtime whether you're on Scala 2 or 3, well, perhaps there's a cleaner/better way, you tell me, but one way that works is:

util.Try(Class.forName("scala.CanEqual")).isSuccess

Here, the choice of scala.CanEqual is arbitrary; it could be any of the small number of classes that are in scala3-library but not in scala-library. But if you are tempted to go that route, you might instead consider including version-specific source in your project, or passing the Scala version via sbt-buildinfo.
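To make the Class.forName probe above concrete, here is a small self-contained sketch; the object name ScalaVersionProbe is a hypothetical example introduced for illustration:

```scala
import scala.util.Try

// scala.CanEqual ships in scala3-library but not in scala-library, so its
// presence on the classpath is a usable (if rough) indicator of Scala 3.
// Class.forName throws ClassNotFoundException when the class is absent,
// which Try converts into a Failure, hence isSuccess.
object ScalaVersionProbe {
  def isScala3: Boolean =
    Try(Class.forName("scala.CanEqual")).isSuccess

  def main(args: Array[String]): Unit =
    println(if (isScala3) "Scala 3 library detected" else "Scala 2 library detected")
}
```

Note that this probes the standard library on the classpath, not the compiler that built your code, which is one reason the suggestion above, baking the version in at build time with sbt-buildinfo, may be the more robust choice.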