How to get around the Scala case class limit of 22 fields?

Solution 1:

More recently (Oct 2016, six years after the OP), the blog post "Scala and 22" by Richard Dallaway explores that limit:

Back in 2014, when Scala 2.11 was released, an important limitation was removed:

Case classes with > 22 parameters are now allowed. 

That said, there still exists a limit on the number of case class fields; see https://stackoverflow.com/a/55498135/1586965

This may lead you to think there are no 22 limits in Scala, but that’s not the case. The limit lives on in functions and tuples.

The fix (PR 2305) introduced in Scala 2.11 removed the limitation for the most common scenarios: constructing case classes, field access (including copying), and pattern matching (barring edge cases).

It did this by omitting the unapply and tupled methods for case classes with more than 22 fields.
In other words, the Function22 and Tuple22 limits themselves still exist, as the sketch below shows.
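
A minimal sketch (assuming Scala 2.11+), using a hypothetical 23-field case class Big: construction, field access, and copy all work, but tupled and unapply are simply not generated.

case class Big(
  f1: Int,  f2: Int,  f3: Int,  f4: Int,  f5: Int,  f6: Int,
  f7: Int,  f8: Int,  f9: Int,  f10: Int, f11: Int, f12: Int,
  f13: Int, f14: Int, f15: Int, f16: Int, f17: Int, f18: Int,
  f19: Int, f20: Int, f21: Int, f22: Int, f23: Int)

val big = Big(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
              13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23)
val big2 = big.copy(f23 = 42) // copying works
println(big2.f23)             // field access works

// But Big.tupled and Big.unapply are not generated (there is no
// Function23 or Tuple23), so e.g. this line would not compile:
// val f = Big.tupled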

Working around the Limit (post Scala 2.11)

There are two common tricks for getting around this limit.

  • The first is to use nested tuples.
    Although it’s true a tuple can’t contain more than 22 elements, each
    element can itself be a tuple (see the sketch after this list).

  • The other common trick is to use heterogeneous lists (HLists), where there’s no 22 limit.
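
A sketch of the nested-tuple trick (hypothetical values): 24 values packed as two 12-element tuples inside an outer pair, so no single tuple exceeds 22 elements.

val packed: ((Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int),
             (Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int)) =
  ((1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12),
   (13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24))

// Element 14 is reached by drilling into the nested structure:
val fourteenth: Int = packed._2._2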

If you want to make use of case classes, you may be better off using the shapeless HList implementation. We’ve created the Slickless library to make that easier. In particular, the recent mappedWith method converts between shapeless HLists and case classes. It looks like this:

import slick.driver.H2Driver.api._
import shapeless._
import slickless._

class LargeTable(tag: Tag) extends Table[Large](tag, "large") {
  def a = column[Int]("a")
  def b = column[Int]("b")
  def c = column[Int]("c")
  /* etc */
  def u = column[Int]("u")
  def v = column[Int]("v")
  def w = column[Int]("w")

  def * = (a :: b :: c :: /* etc */ :: u :: v :: w :: HNil)
    .mappedWith(Generic[Large])
}
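
The example assumes a corresponding Large case class with one Int field per column. A sketch, eliding most of the fields:

case class Large(
  a: Int, b: Int, c: Int, /* etc */
  u: Int, v: Int, w: Int)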

There’s a full example with 26 columns in the Slickless code base.

Solution 2:

This issue was fixed in Scala 2.11 (see Solution 1 for the limits that remain).

Solution 3:

Build a normal class that acts like a case class.

I still use Scala 2.10.x, since that is the latest version supported by Spark, and in Spark SQL I make heavy use of case classes.

The workaround for case classes with more than 22 fields:

class Demo(
    val field1: String,
    val field2: Int,
    // .. and so on ..
    val field23: String)
  extends Product
  // For Spark it has to be Serializable
  with Serializable {

  def canEqual(that: Any): Boolean = that.isInstanceOf[Demo]

  def productArity: Int = 23 // number of columns

  def productElement(idx: Int): Any = idx match {
    case 0 => field1
    case 1 => field2
    // .. and so on ..
    case 22 => field23
  }
}
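
Note that, unlike a real case class, this gives you neither structural equals/hashCode nor toString or copy. If you need value semantics, one sketch of the extra overrides to add inside Demo:

// Sketch: structural equality derived from the Product elements.
override def equals(that: Any): Boolean = that match {
  case d: Demo => d.canEqual(this) && this.productIterator.sameElements(d.productIterator)
  case _       => false
}

override def hashCode: Int = productIterator.toSeq.hashCode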