Roman Janusz | 27 Jun 03:05 2015

Sammy + wildcards + type inference = :(

Hello,

Is there any chance this will work?

$ scala -Xexperimental
Welcome to Scala version 2.11.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40).
Type in expressions to have them evaluated.
Type :help for more information.

scala> new java.util.ArrayList[String]().stream.map(_.toInt)
<console>:8: error: no type parameters for method map: (x$1: java.util.function.Function[_ >: String, _ <: R])java.util.stream.Stream[R] exist so that it can be applied to arguments (java.util.function.Function[String,Int] with Serializable)
 --- because ---
argument expression's type is not compatible with formal parameter type;
 found   : java.util.function.Function[String,Int] with Serializable
 required: java.util.function.Function[_ >: String, _ <: ?R]
Note: String <: Any (and java.util.function.Function[String,Int] with Serializable <: java.util.function.Function[String,Int]), but Java-defined trait Function is invariant in type T.
You may wish to investigate a wildcard type such as `_ <: Any`. (SLS 3.2.10)
              new java.util.ArrayList[String]().stream.map(_.toInt)
                                                        ^
<console>:8: error: type mismatch;
 found   : java.util.function.Function[String,Int] with Serializable
 required: java.util.function.Function[_ >: String, _ <: R]
              new java.util.ArrayList[String]().stream.map(_.toInt)

It would be a very disappointing limitation if it doesn't; it would negate the most important promise of Sammy, which is good Java interop.
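
FWIW, a workaround sketch that should sidestep the inference failure (untested): give the lambda an explicit SAM-typed binding and pin map's type argument, so inference no longer has to solve for R through the wildcards:

import java.util.function.{Function => JFunction}

// The explicit SAM type triggers the -Xexperimental conversion up front,
// and map[Int] fixes R, so the existential is only checked, not inferred.
val toInt: JFunction[String, Int] = (s: String) => s.toInt
new java.util.ArrayList[String]().stream.map[Int](toInt)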

Thanks,
Roman


What is shadowing?

Following a Coursera lecture example:

Why is f visible from within the definition of 'result' while x is not?

Thanks

val x = 0

def f(y: Int) = y + 1

val result = {
  val x = f(3) // f is externally defined
  x * x        // but this x is internally defined and shadows the outer x
} + x
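
Tracing the evaluation makes the scoping concrete:

// f(3) = 3 + 1 = 4, so the block yields 4 * 4 = 16; the trailing `+ x`
// sees the outer x (0) again:
assert(result == 16)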

Raoul Duke | 18 Jun 20:02 2015

re: LSP

> Regarding squares and rectangles--doing it the way you stated violates LSP.
> That there are oodles of OO tutorials that violate LSP with their very first
> example is depressing, though, and in a way is one of the most damning
> indictments of OO.  (Or maybe it's just a case of poor naming: real objects
> gain and lose properties willy-nilly, and our names for the various
> groupings violate LSP like crazy.)

So I thought LSP was a nice idea, but not something that works out
well in reality, due to the context-sensitive, subjective weasel words:
"without altering any of the desirable properties of that program" :-)

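For anyone who hasn't seen it, the usual square/rectangle violation sketched in Scala (the names are the standard tutorial ones):

class Rectangle(var width: Int, var height: Int) {
  def setWidth(w: Int): Unit = width = w
  def setHeight(h: Int): Unit = height = h
  def area: Int = width * height
}

// Geometrically a Square "is-a" Rectangle, but the mutable setters break
// Rectangle's implied contract that setting the width leaves the height alone.
class Square(side: Int) extends Rectangle(side, side) {
  override def setWidth(w: Int): Unit = { width = w; height = w }
  override def setHeight(h: Int): Unit = { width = h; height = h }
}

// A client written against Rectangle's contract:
def stretch(r: Rectangle): Int = { r.setWidth(4); r.setHeight(5); r.area }

stretch(new Rectangle(1, 1)) // 20, as the contract promises
stretch(new Square(1))       // 25: substituting a Square altered the result
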

Jeremy Smith | 18 Jun 00:48 2015

When are scala macros expanded?

I am running into an issue with macros being expanded more eagerly than I expect. I'd like to describe the actual problem first (rather than what I have morphed it into), in case I'm just approaching things in completely the wrong way to begin with, which is likely.

The underlying issue (which I had hoped this use of macros might solve, since this project was already macro-heavy) is that given a trait that I wish to implement (and don't control):

trait InterfaceTrait {
  def foo(str: String): String
  def bar(num: Int): String
}

I want my implementation of foo to require an implicit to be present if it is called.  You cannot simply write:

def foo(str: String)(implicit env: Something): String = ???

Because that wouldn't be considered an implementation of foo from InterfaceTrait.  If you try to do this:

def foo(str: String) = {
  val env = implicitly[Something]
  ...
}

or this:

def _foo(str: String)(implicit env: Something) = ???
def foo(str: String) = _foo(str)

Or any other trick that would materialize the implicit without making it a method parameter of foo itself: in every case the implicit Something is required immediately when the implementation is compiled, *whether or not my foo implementation is ever called*. Obviously, if my implementation is part of a library, this isn't good. And if, as in my case, the implementation is created by a macro, then:
1. foo may never be called, in which case I don't want the implicit to be required (since it is never actually used) in order for the code using my library to compile
2. If foo is called, and the implicit is not present, I want it to fail at the call site of foo, rather than at the site of the macro expansion (or what would be the site of the implicitly[Something] call, if the class definition were not hidden by a macro expansion).

Since my project already uses macros, I thought maybe moving the implicit resolution into a macro could help solve this.  For example, the macro that creates the implementation could write foo as such:

def foo(str: String) = {
  Macros.provideTheImplicit { imp =>
    ...
  }
}

Where Macros.provideTheImplicit is itself a macro-backed function that resolves the implicit and executes the provided anonymous function, passing in the resolved implicit.
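
For concreteness, provideTheImplicit could be sketched like this (just the shape of it, not my actual code; note that it resolves the implicit at its own expansion point, i.e. while the enclosing definition is being typechecked):

import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context

object Macros {
  def provideTheImplicit[A, B](f: A => B): B = macro provideTheImplicitImpl[A, B]

  def provideTheImplicitImpl[A: c.WeakTypeTag, B: c.WeakTypeTag](c: Context)(f: c.Expr[A => B]): c.Expr[B] = {
    import c.universe._
    // Resolve the implicit A at expansion time; EmptyTree means none in scope.
    val imp = c.inferImplicitValue(weakTypeOf[A])
    if (imp.isEmpty)
      c.abort(c.enclosingPosition, s"No implicit ${weakTypeOf[A]} in scope")
    c.Expr[B](q"${f.tree}($imp)")
  }
}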

Finally, I get to the question: I had hoped (I'm not sure why) that Macros.provideTheImplicit {....} would get expanded by the compiler when a call to foo appears in code.  But instead, the call written by the expansion is itself expanded immediately, regardless of whether that code path will ever be reached.

I also tried making provideTheImplicit a method on a class which I instantiate, or instantiate lazily. Neither did what I had hoped.

So the questions are:

1) Can I prevent a macro from being expanded if it never gets called?  This would be an interesting question regardless of whether or not it applies to my use case.
2) Is there some way I can pull an implicit, without it being a method parameter, but without the compiler freaking out until an actual call site occurs?  It doesn't have to be a macro solution; that's just the hammer I was working with at the time.

Thanks for any insight!
Jeremy

Raoul Duke | 16 Jun 19:44 2015

ignorance: customizable immutables?

>   List(new A, new A{ override def foo = "different" }).map(x.asCapitalA)
> If `asCapitalA` has mutating side-effects, you retain the distinct `foo`
> behavior.  If, in contrast, it creates a new immutable A, your unique `foo`
> behavior is lost.

Are there any approaches people like to use to get immutables that
respect whatever customization has been done -- some way to assemble
components and then still be able to frob() them and get a new
immutable copy that retains the customizations? Concisely, robustly? I
mean in any programming language.
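
One sketch of the kind of thing I mean, in Scala (illustrative names): model the customization as data rather than a subclass override, so that copy-style updates carry it along:

case class A(foo: () => String = () => "default", n: Int = 0) {
  def frob(): A = copy(n = n + 1) // the new immutable copy keeps the custom foo
}

val custom  = A(foo = () => "different")
val frobbed = custom.frob()
frobbed.foo() // "different": the customization survives the update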


Stefan Ollinger | 15 Jun 15:59 2015

[macros] untypecheck seems to produce invalid code

Hey,

See the example below. Is this a known bug, or am I doing something wrong?

Regards,
Stefan

------

import scala.reflect.runtime.universe._
import scala.reflect.macros.blackbox.Context
import scala.language.experimental.macros

abstract class A {
  val is: Int
}

def fooImpl(c: Context)(expr: c.Expr[Any]) = {
  import c.universe._

  println(expr)
  // Expr[Nothing]({
  //   final class $anon extends $line22.$read.$iw.$iw.$iw.$iw.$iw.$iw.A {
  //     def <init>(): <$anon: A> = {
  //       $anon.super.<init>();
  //       ()
  //     };
  //     private[this] val is: Int = 100;
  //     override <stable> <accessor> def is: Int = $anon.this.is
  //   };
  //   new $anon()
  // })

  val untypedExpr = c.Expr[Any](c.untypecheck(expr.tree.duplicate))

  println(untypedExpr) // this is invalid: override private[this] val is = 100;
  // Expr[Any]({
  //   final class $anon extends $line22.$read.$iw.$iw.$iw.$iw.$iw.$iw.A {
  //     def <init>() = {
  //       super.<init>();
  //       ()
  //     };
  //     override private[this] val is = 100;
  //     override <stable> <accessor> def is: Int = $anon.this.is
  //   };
  //   new $anon()
  // })

  untypedExpr
}

def foo(expr: => Any): Any = macro fooImpl

// 1. this does not work
foo {
  new A {
    override val is = 100
  }
}

// <console>:18: error: value is overrides nothing
//   override val is = 100

// 2. this works
val tree = reify {
  new A {
    override val is = 100
  }
}

// tree: reflect.runtime.universe.Expr[A] =
// Expr[A]({
//   final class $anon extends $read.A {
//     def <init>() = {
//       super.<init>();
//       ()
//     };
//     override val is = 100
//   };
//   new $anon()
// })

Marconi | 11 Jun 12:42 2015

Announcing ScalaUpNorth, the first Scala conference in Canada. Toronto, Sept 25 & 26.

Scala Up North is the first and only Scala conference organized in Canada. Two days of awesome technical presentations about Scala and its vibrant ecosystem. Come connect with other Scala developers and the companies building their mission-critical applications in Scala. It's like maple syrup for your cake.


Are you not in Canada? Worry not, Toronto in late September is not so cold. Not that much. Hopefully.


Community-oriented, developer-centric: Registered attendees vote to select the presentations they would like to see in the final program with minimal input from the organizers.

Not-for-profit conference: Our financial goal is to break even; every dollar we make will go into improving the conference.


Registration and CFP now open: http://scalaupnorth.com/


Follow us on Twitter: https://twitter.com/ScalaUpNorth

Steven Dobay | 9 Jun 09:55 2015

pocket-sql - generating and running sql queries.

Hi all,
  I've started developing this, and if there's any interest in it I'll enhance it and make a release in the future. Any thoughts?

 Regards, Steven.

Tran Tomas | 1 Jun 23:19 2015

Scala multi dimensional data structure

Hi,

I am new to Scala and I would like to ask about the following situation: you have, for example, a book
      - which has X chapters
      - each chapter has Y pages
      - each page has Z words

What would the MOST EFFICIENT implementation be?
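
For concreteness, a straightforward nested encoding (whether it is most efficient depends on your access patterns):

// Vector gives effectively constant-time indexed access at every level.
case class Page(words: Vector[String])
case class Chapter(pages: Vector[Page])
case class Book(chapters: Vector[Chapter])

val book = Book(Vector.tabulate(3)(c =>
  Chapter(Vector.tabulate(10)(p =>
    Page(Vector.fill(100)(s"word-$c-$p"))))))

// Word 42 on page 2 of chapter 0:
val word = book.chapters(0).pages(2).words(42)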

Thanks in advance


Animesh Pandey | 28 May 23:08 2015

How should I serialize a JSON file in Scala?

I am trying to read a bunch of text files, each of which contains documents. I convert each document into a JSON object, and I want to save the JSON to a single file. How can I do that?

I tried the following code:
package helpers

import java.io.{FileOutputStream, ObjectOutputStream, PrintWriter, File}
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkContext, SparkConf}
import play.api.libs.json._
import scala.collection.mutable.ListBuffer
import scala.collection.JavaConversions._
import org.elasticsearch.hadoop.mr.EsOutputFormat
import org.elasticsearch.hadoop.mr.EsInputFormat
import org.elasticsearch.hadoop.cfg.ConfigurationOptions
import org.apache.hadoop.mapred.{FileOutputCommitter, FileOutputFormat, JobConf, OutputFormat}
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{MapWritable, Text, NullWritable}
import org.elasticsearch.spark._

class esData(var id: String, var text: String) extends Serializable {

  var es_json: JsValue = Json.obj()
  es_json = Json.obj(
    "_index" -> "ES_SPARK_AP",
    "_type" -> "document",
    "_id" -> id,
    "_source" -> Json.obj(
      "text" -> text
    )
  )

  val oos = new ObjectOutputStream(new FileOutputStream("/home/test.json"))
  oos.writeObject(es_json)
  oos.close()
}

class trySerialize {

  def bar() {
    var es_json: JsValue = Json.obj()
    es_json = Json.obj(
      "_index" -> "ES_SPARK_AP",
      "_type" -> "document",
      "_id" -> "12345",
      "_source" -> Json.obj(
        "text" -> "Eureka!"
      )
    )
    println(es_json)
  }

  def foo() {
    val conf = new SparkConf()
      .setAppName("linkin_spark")
      .setMaster("local[2]")
      .set("spark.executor.memory", "1g")
      .set("spark.rdd.compress", "true")
      .set("spark.storage.memoryFraction", "1")
    val sc = new SparkContext(conf)

    val writer = new PrintWriter(new File("/home/test.json"))
    for (i <- 1 to 10) {
      val es_json = new esData(i.toString(), "Eureka!")
      println(es_json)
      //writer.write(es_json.toString() + "\n")
    }
    writer.close()
  }
}

class jsonSerialize() {

  def readDocumentData() {
    val conf = new SparkConf()
      .setAppName("linkin_spark")
      .setMaster("local[2]")
      .set("spark.executor.memory", "1g")
      .set("spark.rdd.compress", "true")
      .set("spark.storage.memoryFraction", "1")
      .set("es.index.auto.create", "true")

    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    import sqlContext._

    val temp = sc.wholeTextFiles("/home/ap890101")
    val docStartRegex = """<DOC>""".r
    val docEndRegex = """</DOC>""".r
    val docTextStartRegex = """<TEXT>""".r
    val docTextEndRegex = """</TEXT>""".r
    val docnoRegex = """<DOCNO>(.*?)</DOCNO>""".r
    val writer = new PrintWriter(new File("/home/test.json"))

    for (fileData <- temp) {
      val filename = fileData._1
      val content: String = fileData._2
      println(s"For $filename, the data is:")

      var startDoc = false  // This is for the
      var endDoc = false    // whole file
      var startText = false
      var endText = false
      var textChunk = new ListBuffer[String]()
      var docID: String = ""
      var es_json: JsValue = Json.obj()

      //val results: Iterator[JsValue] =
      for (current_line <- content.lines) {
        current_line match {
          case docStartRegex(_*) =>
            startDoc = true
            endText = false
            endDoc = false
          case docnoRegex(group) =>
            docID = group.trim
          case docTextStartRegex(_*) =>
            startText = true
          case docTextEndRegex(_*) =>
            endText = true
            startText = false
          case docEndRegex(_*) =>
            endDoc = true
            startDoc = false
            es_json = Json.obj(
              "_id" -> docID,
              "_source" -> Json.obj(
                "text" -> textChunk.mkString(" ")
              )
            )
            //val es_json = new esData(docID, textChunk.mkString(" "))
            //writer.write(s"json") // throws error
            //sc.makeRDD(Seq(Json.stringify(es_json))).saveToEs("ES_SPARK_AP/document") // throws error
            //val input = jsonFile(Json.stringify(es_json)) // throws error
            //writer.write(Json.stringify(es_json)) // throws error
            println(es_json)
            //input.printSchema()
            //println(input.schema)
            textChunk.clear()
          case _ =>
            if (startDoc && !endDoc && startText) {
              textChunk += current_line.trim
            }
        }
      }
    }
    writer.close()
  }
}

object Main2 {
  def main(args: Array[String]) {
    val obj = new trySerialize()
    val obj2 = new jsonSerialize()
    //obj.foo()
    obj2.readDocumentData()
  }
}

I tried serializing it this way but I cannot make the JSON go into a file. I even created a new class esData that is Serializable, but I still get the same error. The function foo() works with or without Serializable, but readDocumentData does not. I cannot understand what to do. The document I am using for testing is here: https://www.dropbox.com/s/gtgxhqqy1ngi7ok/ap890101?dl=0
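
One suspicion (sketched below, untested): the PrintWriter is captured by the closure running on the RDD and is not serializable, so building the JSON strings on the executors and collecting them to the driver before writing might avoid the error. Here parseDocuments is a hypothetical helper standing in for the parsing loop above:

val jsonLines: Array[String] = temp.flatMap { case (_, content) =>
  parseDocuments(content).map(Json.stringify) // JsValues built per document
}.collect()

val localWriter = new PrintWriter(new File("/home/test.json"))
try jsonLines.foreach(localWriter.println)
finally localWriter.close()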

Please advise.

Johannes Rudolph | 21 May 16:57 2015

Factors that prevent implicit conversions from being found

Hi,

in the akka-http/spray routing DSL there's a line for which an implicit
conversion sometimes isn't found (the failure is mostly deterministic
within a given compile run):

parameter('color)

where parameter's argument should be of type `ParamMagnet` and the
implicit conversion is defined in ParamMagnet's companion object.
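
For context, the setup looks roughly like this (simplified sketch, not the actual akka-http source):

import scala.language.implicitConversions

trait ParamMagnet {
  type Out
  def apply(): Out
}

object ParamMagnet {
  // The conversion lives in the magnet's companion, so it is found via
  // implicit scope once the expected type ParamMagnet is known.
  implicit def fromSymbol(name: Symbol): ParamMagnet =
    new ParamMagnet {
      type Out = String
      def apply(): Out = s"extracted parameter ${name.name}"
    }
}

def parameter(magnet: ParamMagnet): magnet.Out = magnet()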

There's quite a bit of logic involved behind the scenes to implicitly
convert a Symbol to a ParamMagnet. The strange thing is that you can
make it compile by adding a line just mentioning the type
`ParamMagnet` above the problematic line:

e.g.

type X = akka.http.scaladsl.server.directives.ParameterDirectives.ParamMagnet
parameter('color)

or

val x: akka.http.scaladsl.server.directives.ParameterDirectives.ParamMagnet = null
parameter('color)

Adding the line below the problematic one doesn't help. I know it's
not the first time I've had such an issue, but I cannot remember
whether I found a solution before.

`-Xlog-implicits` doesn't show anything. The compilation problem
occurs with Scala 2.10.x.

Any ideas?

Thanks,
Johannes
