Johannes Rudolph | 21 May 16:57 2015

Factors that prevent implicit conversions from being found

Hi,

in the akka-http/spray routing DSL there is a case where an implicit
conversion is sometimes not found (mostly deterministic per compile
run) for this line:

parameter('color)

where parameter's argument should be of type `ParamMagnet` and the
implicit conversion is defined in ParamMagnet's companion object.

There is quite a bit of logic behind the scenes to implicitly
convert a Symbol to a ParamMagnet. The strange thing is that you can
make it compile by adding a line that merely mentions the type
`ParamMagnet` above the problematic line:

e.g.

type X = akka.http.scaladsl.server.directives.ParameterDirectives.ParamMagnet
parameter('color)

or

val x: akka.http.scaladsl.server.directives.ParameterDirectives.ParamMagnet = null
parameter('color)

Adding the line below the problematic one doesn't help. I know it's
not the first time I've had such an issue, but I cannot remember
whether I found a solution before.
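For readers unfamiliar with the magnet pattern in play here, a minimal self-contained sketch of the setup (names simplified for illustration; this is not the actual akka-http source):

```scala
import scala.language.implicitConversions

// Minimal sketch of the magnet pattern used by the routing DSL.
// The real ParamMagnet lives in akka.http.scaladsl.server.directives.
trait ParamMagnet {
  type Out
  def apply(): Out
}

object ParamMagnet {
  // The implicit conversion the compiler must find in the companion
  // object: turning a Symbol (e.g. 'color) into a magnet.
  implicit def fromSymbol(name: Symbol): ParamMagnet =
    new ParamMagnet {
      type Out = String
      def apply(): Out = s"extracted parameter '${name.name}'"
    }
}

object Demo {
  // `parameter` takes the magnet, so `parameter('color)` only compiles
  // if the implicit conversion above is found by the compiler.
  def parameter(magnet: ParamMagnet): magnet.Out = magnet()

  def main(args: Array[String]): Unit =
    println(parameter('color))
}
```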

lomo hany | 13 May 22:30 2015

[scala-language] Fwd: US Congress hearing of Maan Alsaan money laundering [Arabic: the Congress money-laundering case of billionaire Maan Al-Sanea]




 

YouTube videos of the U.S. Congress money-laundering hearing of the Saudi billionaire "Maan Al Sanea" with Bank of America, owner of the Saad Hospital and Schools in the Eastern Province of Saudi Arabia and Chairman of the Board of Directors of Awal Bank in Bahrain. With Arabic subtitles.

[Translated from Arabic:] The YouTube page showing the U.S. Congress hearing that followed the money-laundering activities of the Saudi Maan Abdulwahed Al-Sanea, owner of the Saad hospital, company, and schools in the Eastern Province of Saudi Arabia and Chairman of the Board of Directors of Awal Bank of Bahrain. Subtitled in Arabic.

http://www.youtube.com/watch?v=mIBNnQvhU8s

--
You received this message because you are subscribed to the Google Groups "scala-language" group.
To unsubscribe from this group and stop receiving emails from it, send an email to scala-language+unsubscribe <at> googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Christoph Neijenhuis | 12 May 17:34 2015

Failure should probably extend Try[Nothing], as shown in the first lecture of Principles of Reactive Programming

In the lecture Monads (slide 22), Try with Success and Failure was introduced. In particular, Failure was defined as:

case class Failure(ex: Exception) extends Try[Nothing]

However, in scala.util it is actually defined this way:

final case class Failure[+T](exception: Throwable) extends Try[T]

I tried to figure out why Failure would remain a generic class while e.g. None doesn't, but I couldn't come up with an explanation. In fact, I'd argue Failure, like None, should use the bottom type for two reasons:

1. When dealing with a Failure, one doesn't have to worry about a generic type that doesn't make any difference anyway. E.g. when pattern matching:

case None => ... // Compiles
case Failure => ... // Does not compile: "pattern type is incompatible with expected type"
case Failure[_] => ... // Compiles, but is unintuitive when working with None previously

Or when passing a failure along, one has to cast to the "correct" generic type (seen in the implementation of Failure itself, but this example is from the implementation of Future)

case f: Failure[_] => p complete f.asInstanceOf[Failure[(T, U)]]

When using the bottom type, this simply becomes:

case f: Failure => p complete f

2. The method signatures and the generated scaladoc are more obvious. E.g. the get method of None is defined as:

def get: Nothing

whereas the get method of Failure is:

def get: T

I'd argue in the case of None, it's much easier to figure out one shouldn't use the get method based on the signature.
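To make the two arguments above concrete, here is a minimal self-contained sketch of the proposed definition (a stripped-down Try hierarchy, not the real scala.util code), showing that covariance makes the asInstanceOf cast unnecessary:

```scala
// Stripped-down Try hierarchy illustrating the proposal; not the real
// scala.util.Try, just a sketch.
sealed trait MyTry[+T]
final case class MySuccess[+T](value: T) extends MyTry[T]
// Failure fixed to the bottom type, like None:
final case class MyFailure(exception: Throwable) extends MyTry[Nothing]

object Demo {
  def describe(t: MyTry[Int]): String = t match {
    case MySuccess(v) => s"got $v"
    case f: MyFailure => s"failed: ${f.exception.getMessage}" // no type argument needed
  }

  def main(args: Array[String]): Unit = {
    val f = MyFailure(new RuntimeException("boom"))
    // Because MyTry is covariant, the same Failure value can be used at
    // any element type, with no asInstanceOf:
    val a: MyTry[Int] = f
    val b: MyTry[(Int, String)] = f
    println(describe(a))
    println(describe(MySuccess(42)))
  }
}
```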


I went ahead and changed the implementation of Failure: https://github.com/cneijenhuis/scala/commit/e03c6bda9d6c92b764f540278d718098e7778791#diff-a2cc47b875d07181ae9e71681fb3f07dL211
But the resulting class is obviously not compatible with the previous version. I fixed the resulting errors in Future and JavapClass as well; these changes also show nicely why I think it's beneficial to use Nothing.

Anyway, my questions are:
Is there a reason I missed why Failure should really be a generic class and not extend Try[Nothing]?
If not - is there any chance I can submit a pull request with this? After all, it's a breaking change... but I do see those are scheduled for "Aida", and, while Try isn't a collection, this change would fit with the theme of "we want to make them even easier to use" :-)

Best,
Christoph

Alex Ivanov | 12 May 09:13 2015

[scala-language] The cake’s problem, dotty design and the approach to modularity.

Good morning everyone.

I'm very interested in the topic of modularity in Scala. It's no secret that today the Cake Pattern is the most popular way to achieve modularity in a project and, to some extent, a nice way to manage dependencies. On the other hand, this way of structuring an application is increasingly regarded as an anti-pattern, but unfortunately there's not much explanation of why. I feel it's pretty hard to answer that without a big, active project (like the Scala compiler) to point to, but in general I can think of things like: complicated design, a not-so-nice way to manage dependencies, problems with binary compatibility, and slower compilation due to inheritance overuse and complex cyclic dependencies. Are there other issues I didn't think of?

I remember that not so long ago Prof. Odersky mentioned he was also disappointed in the pattern and went a somewhat different way with the Dotty design. Unfortunately, I couldn't find details on this decision: what the problems were and how they were solved by the new design (it would be good to read a paper like Scalable Component Abstractions). The project is open-source, so it's not hard to take a look and see that there are some differences in the design: some components are packed within modules/objects, and there is less cake-style coupling (besides the Context structure). Could someone please unveil the secrets of the Dotty architectural design?

And the last question that bothers me: what should we use instead of the Cake? In the talk "Scala - the simple parts" I liked the slide on Scala's modular roots, where we can see the mapping between Scala and SML language constructs. I can't reason much about this yet; I've just started looking at what we can achieve with SML-style modularity (maybe someone could explain this as well?), but it looks like a better approach to modular design, especially in combination with typeclasses. Of course we should use the right tool for the problem, but it feels like the Cake Pattern is not good as the essential approach to application architecture and modularity.
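For concreteness, a minimal sketch of the two styles being compared (illustrative names only, not taken from any real codebase):

```scala
// Cake style: components as traits with self-type dependencies,
// wired together by mixin composition.
trait UserRepositoryComponent {
  def userRepository: UserRepository
  trait UserRepository { def find(id: Int): String }
}

trait UserServiceComponent { this: UserRepositoryComponent =>
  def userService: UserService
  trait UserService { def greet(id: Int): String }
}

// SML-ish module style: plain traits/classes with explicit
// constructor parameters, roughly what the post contrasts the cake with.
trait UserRepo { def find(id: Int): String }

class UserService(repo: UserRepo) {
  def greet(id: Int): String = s"Hello, ${repo.find(id)}"
}

object Demo {
  def main(args: Array[String]): Unit = {
    val repo = new UserRepo { def find(id: Int) = s"user-$id" }
    // Dependencies are passed explicitly, no inheritance needed:
    println(new UserService(repo).greet(42))
  }
}
```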

Andrew Phillips | 8 May 03:21 2015

Re: Re: Strange type error

Hi Toby

> So, at the time the compiler needs to decide which part of the code constitutes an anonymous function, it cannot 
> yet take into account, what type is required. It needs to go by syntax only, and that leaves few options besides 
> limiting it to simple expressions.

Just to add to Oliver's answer: it's not explicitly stated this way in the spec (6.23), but what works for me as a guideline is to regard the scope of the anonymous function the compiler creates as extending to the innermost enclosing expression containing the '_' placeholder.

So _ + _ * _ is desugared to (x, y, z) => x + y * z because the innermost enclosing expression is the whole expression. In the case of _ + (_ * _), however, the innermost enclosing expression for the latter two placeholders is (_ * _), so the compiler first expands that to an anonymous function _ + ((y, z) => y * z) and then does the same for the first placeholder.

This also explains why e.g.

List(1, 2).map { i => println("Hi"); i + 1 } and
List(1, 2).map { println("Hi"); _ + 1 }

behave differently [1]: the innermost enclosing expression for the placeholder in the second statement is _ + 1, so the result is effectively

List(1, 2).map { println("Hi"); i => i + 1 }
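A runnable check of that difference (plain Scala, nothing beyond the standard library):

```scala
object PlaceholderScope {
  def main(args: Array[String]): Unit = {
    // Explicit lambda: the println is inside the function body,
    // so it runs once per element (prints "Hi" twice).
    List(1, 2).map { i => println("Hi"); i + 1 }

    // Placeholder: the innermost enclosing expression is `_ + 1`,
    // so only that becomes the function. The println is evaluated
    // once, eagerly, when the block is evaluated (prints "Hi" once).
    List(1, 2).map { println("Hi"); _ + 1 }
  }
}
```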

Regards

ap

[1] http://scalapuzzlers.com/#pzzlr-001

Toby | 7 May 17:15 2015

Re: Re: Strange type error

Ok, so in essence when building the tree the compiler does not yet realise that it requires a three-parameter function at that point and hence follows the 'default approach' for combining low-level stuff into higher level stuff. The brackets change what that default is.

I can't say that I understand the why any better, but I suspect that that is not a 'user-level' issue. Sometimes, there is just too much sugar!

On Thursday, 7 May 2015 16:50:35 UTC+2, Oliver Ruebenacker wrote:

     Hello,

  First, the compiler needs to turn the source code into a syntax tree. Only after that, it assigns types to nodes of the tree.

  So, at the time the compiler needs to decide which part of the code constitutes an anonymous function, it cannot yet take into account, what type is required. It needs to go by syntax only, and that leaves few options besides limiting it to simple expressions.

  Something like (_ + _) * 3 is not necessarily illegal, there may be an implicit conversion of a function into something that has a * operator.
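To illustrate that point, a sketch in which (_ + _) * 3 type-checks (the Mult wrapper and the conversion are hypothetical, not anything in the standard library):

```scala
import scala.language.implicitConversions

object ImplicitTimes {
  // Hypothetical wrapper giving a two-argument function a `*` operator.
  class Mult(f: (Int, Int) => Int) {
    def *(n: Int): (Int, Int) => Int = (a, b) => f(a, b) * n
  }

  // With this conversion in scope, multiplying a function by an Int
  // becomes a legal expression.
  implicit def toMult(f: (Int, Int) => Int): Mult = new Mult(f)

  def main(args: Array[String]): Unit = {
    // Placeholder types annotated so the function type is known
    // before the conversion is searched for:
    val g = ((_: Int) + (_: Int)) * 3
    println(g(1, 2)) // (1 + 2) * 3 = 9
  }
}
```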

     Best, Oliver

On Thu, May 7, 2015 at 10:31 AM, Toby <to... <at> meliorbis.com> wrote:
Thank you Janusz and Phillips for your answers.

However, you have told me what happens but not why it happens. I am hence a little wiser but no more enlightened. 

Why does the compiler decide that the presence of brackets requires nested functions, rather than one function with multiple parameters and precedence? The unique type that would satisfy the requirements of the function being called is a function of three Double parameters. This makes no sense to me. 

On Thursday, 7 May 2015 13:10:28 UTC+2, Andrew Phillips wrote:
> but with additional brackets to change precedence, (_ + _) * _ fails

Just for completeness: as you probably also discovered, it also fails if you don't affect the precedence, and simply add brackets for e.g. clarity:

scala> execute( _ + (_ * _) )
<console>:9: error: missing parameter type for expanded function ((x$1) => x$1.$
plus(((x$2, x$3) => x$2.$times(x$3))))
              execute( _ + (_ * _) )
                       ^

Regards

ap




--
Oliver Ruebenacker
Solutions Architect at Altisource Labs
Be always grateful, but never satisfied.

--
You received this message because you are subscribed to the Google Groups "scala-language" group.
To unsubscribe from this group and stop receiving emails from it, send an email to scala-language+unsubscribe <at> googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Toby | 7 May 11:30 2015

Strange type error

Dear Scala Team,

See below for a simple example that uses placeholder syntax but fails to compile. Essentially, execute accepts a ternary Double operator. In the first example, where the function literal is _ + _ * _, the compiler accepts this, but with additional brackets to change precedence, (_ + _) * _ fails.

I'm sure there is a straightforward explanation for this, but I am currently at a loss as to what it is. Any insights would be much appreciated.

Thanks,

Tobias Grasl

object TestTernaryOp {

  def execute(fn: (Double, Double, Double) => Double) = {
    fn(1, 2, 3)
  }

  def main(args: Array[String]) = {

    // Compiles fine
    println(execute( _ + _ * _ ))

    // Does not compile - type inference error
    println(execute( (_ + _) * _ ))
  }

}



Rado Buranský | 6 May 21:25 2015

AbstractTraversable.headOption throws NoSuchElementException

The subject is not really correct, but I'd like to know your opinion on how to avoid this kind of issue:

import scala.collection.JavaConversions._

val files: java.lang.Iterable[...] = ...
files.headOption match { ... // This throws NoSuchElementException

If you're curious the stack trace looked like this:
...
Caused by: java.util.NoSuchElementException
at com.google.common.collect.AbstractIterator.next(AbstractIterator.java:152)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:42)
at scala.collection.IterableLike$class.head(IterableLike.scala:91)
at scala.collection.AbstractIterable.head(Iterable.scala:54)
at scala.collection.TraversableLike$class.headOption(TraversableLike.scala:436)
at scala.collection.AbstractTraversable.headOption(Traversable.scala:105)
...

The reason is that this implementation of Iterable can be iterated only once. But it took me a while to find that out. I simply called an API (SonarQube) to get a collection of files, and I naturally didn't care about the implementation of the Iterable interface. Should I? Whose fault is this, and how can it be avoided?
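One defensive fix is to materialize the wrapped Iterable exactly once before calling collection methods on it. A sketch (the one-shot Iterable below is hypothetical, standing in for the Guava-backed one in the stack trace):

```scala
import scala.collection.JavaConversions._

object OneShotDemo {
  // A java.lang.Iterable that hands out one shared iterator: every
  // wrapper method that asks for a "fresh" iterator actually advances
  // the same one, which is the trait that caused the surprise.
  def oneShot(xs: Int*): java.lang.Iterable[Int] = new java.lang.Iterable[Int] {
    private val it = java.util.Arrays.asList(xs: _*).iterator()
    override def iterator(): java.util.Iterator[Int] = it
  }

  def main(args: Array[String]): Unit = {
    val files = oneShot(1, 2, 3)
    // Materialize once; after this, headOption and friends are safe
    // and repeatable, because they run against an immutable List.
    val safe = files.toList
    println(safe.headOption) // Some(1)
    println(safe.headOption) // Some(1) again; source not re-iterated
  }
}
```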

Thanks for opinions

scalanewbie | 2 May 12:57 2015

var vs def

Hello All,

I am a newbie to Scala and am getting to grips with var and def.

I noticed that I could write :

scala> var greatWorld2 = println("great world")

great world

greatWorld2: Unit = ()


AND


scala> def greatWorld() = println("great world")

greatWorld: ()Unit


Can someone please help me understand the difference?


I am aware that var is generally used to define variables and def is used to define functions. 


Now, keeping in mind that greatWorld2 is actually a variable and not a function, I am trying to print it, and I get:


scala> greatWorld2


OR, if I try calling it as a function, it throws an error, e.g. below (which sounds OK, as I did not declare it as a function, so I was expecting this anyway).


scala> greatWorld2()

<console>:9: error: Unit does not take parameters

              greatWorld2()



Long story short, it would be very good to understand the difference between var and def in terms of what the interpreter and/or compiler is doing underneath.
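To summarize the difference the poster is seeing: a `var` definition evaluates its right-hand side immediately (once), so the println fires at definition time and the variable just holds the Unit value `()`; a `def` defers its body until each call. A small sketch:

```scala
object VarVsDef {
  def main(args: Array[String]): Unit = {
    // RHS runs NOW, exactly once; greatWorld2 stores the result,
    // which for println is the Unit value ().
    var greatWorld2 = println("great world (from var)")
    println(s"greatWorld2 = $greatWorld2") // prints "greatWorld2 = ()"

    // Body runs each time the method is called; nothing printed yet.
    def greatWorld(): Unit = println("great world (from def)")
    greatWorld() // prints now
    greatWorld() // and again
  }
}
```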


Thanks Guys,

S



Zhu Tan | 30 Apr 17:06 2015

updates on 2.12 release date?

Hi, 
I am under pressure to upgrade our code base to Java 8, which means upgrading to Scala 2.12. The road-map post on scala-lang is from 2014.
Does anyone know of any update on the release date of 2.12?
Many thanks, 
Zhu

Roman Janusz | 14 Apr 23:46 2015

Compiler plugin for warning suppression

Hello,

Recently, inspired by the Towards a Safer Scala talk from this year's ScalaDays, I wanted to try out the set of scalac options recommended by the speaker:

scalacOptions ++= Seq(
  "-Xlint",
  "-deprecation",
  "-Xfatal-warnings"
)

Unfortunately, almost immediately I ran into a show-stopper: Scala has no warning suppression similar to Java's @SuppressWarnings. I googled around and was unpleasantly surprised to find that requests for this feature have been rejected; see for example SI-1781.

But I realized that writing a compiler plugin which could fill that gap would not be that hard.

So here it is: silencer

It would be nice to have some feedback from you, Scala users, to see if there is any interest in maintaining such a plugin. The current proof-of-concept implementation is really simple and small, but it uses internal scalac API, so I guess there's no guarantee it won't break with a new Scala version. That is also why I think some community support would help a lot.

Cheers,
Roman

