Geoff Reedy | 1 Jul 01:36 2011

Re: Native Scala base classes to help porting Scala to other platforms

In my mind, the goal end state is to have no references to the java
package in core library subset and compiler sources.

I can see this happening in one of two ways right now:

1. Essentially remap the java APIs to some other package. For the JVM
   the implementation of the remapped APIs could be a bunch of package
   objects with type aliases and val forwarders (see the sketch after
   this list).

   For other platforms there would be an implementation of the APIs
   written in Scala. The implementation would be split between things
   that can be implemented portably and those that require
   implementations specific to the target platform. For classes where
   only a portion of the methods are not portably implementable there
   would be a shared trait for the portable part extended by the
   platform specific implementation classes.

   Theoretically adding

     import _root_.scala.{platform => java}

   to the top of a source file would be sufficient to make it use these
   classes instead of those in the JRE except for the implicit import of
   java.lang._

2. Design an idiomatic Scala platform API. Split the API into parts that
   can be implemented portably and those that can't, just as above.

   In this case even the JVM target would require additional code to
   implement the platform API on top of JRE classes.
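
   To make option 1 concrete, the JVM side could be as thin as a package
   object full of aliases. The scala.platform name and the members below
   are placeholders for illustration only, not a proposed API:

// JVM side of the remapped API: nothing but aliases and forwarders into
// the JRE. Type aliases erase completely; the forwarders are one thin hop.
package scala

package object platform {
  // type aliases re-exporting JRE types under the platform namespace
  type String    = java.lang.String
  type Throwable = java.lang.Throwable
  type Exception = java.lang.Exception

  // forwarders for statics we want to expose as ordinary members
  val MaxInt: Int = java.lang.Integer.MAX_VALUE
  def currentTimeMillis(): Long = java.lang.System.currentTimeMillis()
}

   Other platforms would supply a scala.platform written in Scala instead,
   split along the portable/platform-specific line described above.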

Miguel Garcia | 1 Jul 12:37 2011

Re: Native Scala base classes to help porting Scala to other platforms



Sébastien, 
Geoff, 

(Not sure why Geoff's reply doesn't show up in Google groups yet, but my comments refer to it). 


A straightforward way to find java.lang.* dependencies is the compiler option -Yno-imports,
whose effect is "Compile without importing scala.*, java.lang.*, or Predef."
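
As a quick illustration (a toy example of my own, nothing official), compiling the following with -Yno-imports makes the usually invisible java.lang dependency explicit:

// Probe.scala -- compile with: scalac -Yno-imports Probe.scala
object Probe {
  // an unqualified `String` would fail to resolve here, because
  // java.lang._ is no longer imported; writing the package out shows
  // exactly where the platform dependency lives
  def greet(name: java.lang.String): java.lang.String = "hi, " + name
}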


A few more details on "-Yno-imports": it affects the following: 


  /** List of symbols to import from in a root context.  Typically that
   *  is java.lang, scala, and scala.Predef, in that order.  Exceptions:
   * 
   *  -- if -Yno-imports is given, nothing is imported
   *  -- if the unit is java defined, only java.lang is imported
   *  -- if -Yno-predef is given, if the unit has an import of Predef
   *     among its leading imports, or if the unit is scala.ScalaObject
   *     or scala.Predef, Predef is not imported.
   */
  protected def rootImports(unit: CompilationUnit, tree: Tree): List[Symbol] = {
    import definitions._
    assert(isDefinitionsInitialized, "definitions uninitialized")

    if (settings.noimports.value) Nil
    else if (unit.isJava) List(JavaLangPackage)
    else if (settings.nopredef.value || treeInfo.noPredefImportForUnit(unit.body)) List(JavaLangPackage, ScalaPackage)
    else List(JavaLangPackage, ScalaPackage, PredefModule)
  }



> 2. Design an idiomatic Scala platform API. Split the API 
> into parts that can be implemented portably and those 
> that can't just as above.
> In this case even the JVM target would require additional code to
> implement the platform API on top of JRE classes.


I see. It's doable, it's for the common good, and so on. 


> I'm not sure that I understand what you mean by 
> whitelisting and book-keeping. By this do you mean generating 
> a list of java classes which are allowable in the compiler 
> and library and a process for enforcing the constraint?

What I had in mind is some automatic means to check whether platform dependencies had leaked into the codebase. 

A tool to help with this need not be fancy; -Yno-imports helps somewhat, but a dedicated compiler plugin can do a better job. 
For example, to find out whether a callsite invokes JDK stuff one can test: 
  msym.owner.ownerChain contains JavaPackageClass
where 
   for(msym <- a.tpe.deferredMembers  . . . 
Other examples in 
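
A rough cut at such a plugin could look like the sketch below. The plugin name, the phase placement and the crude fullName test are placeholders of mine; the ownerChain test above is the more principled check against the symbol table.

package jdkcheck

import scala.tools.nsc.{Global, Phase}
import scala.tools.nsc.plugins.{Plugin, PluginComponent}

// Reports every reference to a symbol living under the java package.
class JdkUsage(val global: Global) extends Plugin {
  import global._

  val name = "jdkusage"
  val description = "reports references to java.* symbols"
  val components = List[PluginComponent](Component)

  private object Component extends PluginComponent {
    val global: JdkUsage.this.global.type = JdkUsage.this.global
    val phaseName = "jdkusage"
    val runsAfter = List[String]("typer")

    def newPhase(prev: Phase): Phase = new StdPhase(prev) {
      def apply(unit: CompilationUnit) {
        val check = new Traverser {
          override def traverse(tree: Tree) {
            val sym = tree.symbol
            // crude test: anything whose fully qualified name starts with "java."
            if (sym != null && sym != NoSymbol && (sym.fullName startsWith "java."))
              unit.warning(tree.pos, "platform dependency: " + sym.fullName)
            super.traverse(tree)
          }
        }
        check traverse unit.body
      }
    }
  }
}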


That plugin would sometimes report good news, e.g. that java.util.Properties is used but in fact it's already pretty much cordoned off in 
scala.sys.SystemProperties (thanks, Paul!): 

/** A bidirectional map wrapping the java System properties.
 *  Changes to System properties will be immediately visible in the map,
 *  and modifications made to the map will be immediately applied to the
 *  System properties.  If a security manager is in place which prevents
 *  the properties from being read or written, the AccessControlException
 *  will be caught and discarded.
 *
 *  @author Paul Phillips
 *  @version 2.9
 *  @since   2.9
 */
class SystemProperties extends mutable.Map[String, String] {



Miguel 

Simon Ochsenreither | 1 Jul 12:55 2011

Re: Native Scala base classes to help porting Scala to other platforms

Hi everyone,

one of the problems I'm currently looking at is numbers in all variations.

It seems like we basically have to extend ScalaNumber because it makes 
the class eligible for certain compiler hacks for comparing numbers with 
each other. But ScalaNumber extends java.lang.Number, which has all this 
legacy stuff like intValue/longValue/... which duplicates the Scala methods.
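
To make the duplication concrete, any ScalaNumber subclass today ends up looking roughly like this (MyBigInt and its method bodies are purely illustrative):

import scala.math.ScalaNumber

class MyBigInt(private val rep: java.math.BigInteger) extends ScalaNumber {
  // the Scala-side conversions we actually want
  def toInt: Int       = rep.intValue
  def toLong: Long     = rep.longValue
  def toDouble: Double = rep.doubleValue

  // the java.lang.Number legacy, duplicating the above
  def intValue: Int       = toInt
  def longValue: Long     = toLong
  def floatValue: Float   = rep.floatValue
  def doubleValue: Double = toDouble

  // required by ScalaNumber itself
  def isWhole()  = true
  def underlying = rep
}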

My plan is to ignore that for a moment and come up with cruft-free 
implementations, which might require some additional tuning to make them 
run on the various platforms, but come without the whole legacy baggage.

So I would have a look at things like the value types and BigInt, 
BigDecimal first. Implementing and getting these things right is hard, 
so I have no idea if I'm able to do it and how long it takes, but at 
least they don't have as much connection to the underlying platform as 
the things in util.concurrent, reflect, io, etc.

But generally, if Geoff or Sébastien need some class, this will have 
priority.

Is there a way to count the usage of various Java libraries in the Scala 
library? I had a short look with IntelliJ, but didn't get any substantial numbers.
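
Failing a proper tool, the crude thing I might try is the sketch below; it only sees import clauses and misses fully qualified references, so treat the numbers as a first approximation:

import java.io.File
import scala.io.Source

object CountJavaImports {
  // recursively collect all .scala files under a directory
  def scalaFiles(dir: File): Seq[File] = {
    val entries = dir.listFiles
    if (entries == null) Seq()
    else {
      val (dirs, files) = entries.toSeq.partition(_.isDirectory)
      files.filter(_.getName endsWith ".scala") ++ dirs.flatMap(scalaFiles)
    }
  }

  def main(args: Array[String]) {
    val ImportJava = """\s*import\s+(java\.[\w.]+).*""".r
    val counts = scala.collection.mutable.HashMap[String, Int]()
    for {
      file <- scalaFiles(new File(args(0)))
      line <- Source.fromFile(file).getLines()
    } line match {
      case ImportJava(pkg) =>
        // bucket by the first two components, e.g. java.util, java.io
        val key = pkg.split('.').take(2).mkString(".")
        counts(key) = counts.getOrElse(key, 0) + 1
      case _ =>
    }
    counts.toSeq.sortBy(p => -p._2) foreach { case (p, n) => println(p + ": " + n) }
  }
}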

Thanks and bye,

Simon

Tiark Rompf | 1 Jul 16:26 2011

Re: Native Scala base classes to help porting Scala to other platforms

I would like to stress the importance of not slowing down JVM code by adding levels of indirection.
- Tiark

On Jul 1, 2011, at 12:55 PM, Simon Ochsenreither wrote:

> Hi everyone,
> 
> one of the problems I'm currently looking at is numbers in all variations.
> 
> It seems like we basically have to extend ScalaNumber because it makes the class eligible for certain compiler hacks for comparing numbers with each other. But ScalaNumber extends java.lang.Number, which has all this legacy stuff like intValue/longValue/... which duplicates the Scala methods.
> 
> My plan is to ignore that for a moment and come up with cruft-free implementations, which might require some additional tuning to make them run on the various platforms, but come without the whole legacy baggage.
> 
> So I would have a look at things like the value types and BigInt, BigDecimal first. Implementing and getting these things right is hard, so I have no idea if I'm able to do it and how long it takes, but at least they don't have as much connection to the underlying platform as the things in util.concurrent, reflect, io, etc.
> 
> But generally, if Geoff or Sébastien need some class, this will have priority.
> 
> Is there a way to count the usage of various Java libraries in the Scala library? I had a short look with IntelliJ, but didn't get any substantial numbers.
> 
> Thanks and bye,
> 
> 
> Simon

Simon Ochsenreither | 1 Jul 17:17 2011

Re: Native Scala base classes to help porting Scala to other platforms

Hi Tiark,
> I would like to stress the importance of not slowing down JVM code by adding levels of indirection.
> - Tiark
AFAIU, no one has plans in that direction. I'm pretty sure Scala on the 
JVM will always depend on Java classes.

The discussion is mostly about what to do when Scala runs on a different 
platform.
A first step could be that the individual ports replace the 
references to java.lang with the native Scala ones after the compiler has run.

Thanks and bye,

Simon

Miles Sabin | 1 Jul 18:14 2011

Suspicious-looking order dependency in implicit resolution

Hi folks,

I think this is probably a bug, but I thought I'd canvass opinions here
before opening a ticket.

Consider the following single compilation unit,

implicitorder.scala

object ImplicitConsumer {
  import ImplicitProvider._
  implicitly[String]
}

object ImplicitProvider {
  implicit def foo = "foo"
}

Compiling this (with 2.9.0-1 and trunk) produces the error,

implicitorder.scala:3: error: could not find implicit value for
parameter e: String
  implicitly[String]

We can fix that either by adding a type annotation to the definition
of foo or by moving the definition of ImplicitConsumer after the
definition of ImplicitProvider. That surprised me a little, because
I'd thought that requirement for explicit type annotations in the
forward-use case only applied within top-level definitions, not
between them. But whatever, not so big a deal.
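
For reference, the first of those two fixes is simply giving foo an explicit result type, i.e. something like:

object ImplicitProvider {
  // the explicit result type lets the forward reference resolve
  implicit def foo: String = "foo"
}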

But now take those same definitions and place them in two separate
compilation units,

implicitorder.scala

object ImplicitConsumer {
  import ImplicitProvider._
  implicitly[String]
}

implicitorder2.scala

object ImplicitProvider {
  implicit def foo = "foo"
}

And compile both together (again, 2.9.0-1 and trunk). This time no error.

On the face of it, either both of these should compile successfully or
both should fail. And I think both should compile successfully,
because where we're dealing with top-level definitions, the textual
order within a compilation unit shouldn't be significant.

Is this a bug? If it is, is it a new one? I haven't been able to find
anything (open) in Jira that looks similar.

Cheers,

Miles

-- 
Miles Sabin
tel: +44 7813 944 528
gtalk: miles@...
skype: milessabin
http://www.chuusai.com/
http://twitter.com/milessabin

Alex Cruise | 1 Jul 18:22 2011

Re: Suspicious-looking order dependency in implicit resolution

On Fri, Jul 1, 2011 at 9:14 AM, Miles Sabin <miles@milessabin.com> wrote:
> I think this is probably a bug, but I thought I'd canvass opinions here
> before opening a ticket.

No doubt you're aware of it already, but https://issues.scala-lang.org/browse/SI-801?focusedCommentId=41419 is useful for many others who might encounter this situation. :)

-0xe1a
Miles Sabin | 1 Jul 18:28 2011

Re: Suspicious-looking order dependency in implicit resolution

On Fri, Jul 1, 2011 at 5:22 PM, Alex Cruise <alex@...> wrote:
> On Fri, Jul 1, 2011 at 9:14 AM, Miles Sabin <miles@...> wrote:
>>
>> I think this is probably a bug, but I thought I'd canvass opinions here
>> before opening a ticket.
>
> No doubt you're aware of it already,
> but https://issues.scala-lang.org/browse/SI-801?focusedCommentId=41419 is
> useful for many others who might encounter this situation. :)

Yes, but, like I said, I only expected that rule to operate within
top-level definitions, not between them.

Cheers,

Miles

-- 
Miles Sabin
tel: +44 7813 944 528
gtalk: miles@...
skype: milessabin
http://www.chuusai.com/
http://twitter.com/milessabin

Paul Phillips | 1 Jul 19:30 2011

Re: Suspicious-looking order dependency in implicit resolution

On 7/1/11 9:14 AM, Miles Sabin wrote:
> I think this is probably a bug, but I thought I'd canvass opinions here
> before opening a ticket.

To my knowledge the behavior has not been specified.  Attempting to reverse engineer the rule from the error message

"implicit method foo is not applicable here because it comes after the application point and it lacks an
explicit result type"

one could argue the behavior is consistent with that.  I think there will be reluctance to offer much in the
way of guarantees.  Here is some behavior one can currently witness.

// c.scala
object A {
  import B._

  implicitly[String]
  implicitly[Int]

  println(List(2): Set[Int])
  println(List("def"): Set[String])

  implicit def bippyc[T](x: List[T])(implicit p: T) = { println("bippyc converted " + x) ; x.toSet }
  implicit def bippy = 5
}

// c2.scala
object B {
  import A._

  implicitly[String]
  implicitly[Int]

  println(List(1): Set[Int])
  println(List("abc"): Set[String])

  implicit def fooc[T](x: Seq[T])(implicit p: String) = { println("fooc converted " + x) ; x.toSet }
  implicit def foo = "foo"
}

If we comment out "implicitly[Int]" in c.scala, then this compiles, which poses interesting questions
which I'll skip past:

% scalac c.scala c2.scala

Not this though:

% scalac c2.scala c.scala
c2.scala:4: error: could not find implicit value for parameter e: String
  implicitly[String]
            ^
c2.scala:8: error: could not find implicit value for parameter p: java.lang.String
  println(List("abc"): Set[String])
              ^
two errors found

Or, with implicitly[Int] still commented out, add an explicit "Set[T]" result type to bippyc in c.scala, and:

% scalac c.scala c2.scala
c.scala:7: error: could not find implicit value for parameter p: Int
  println(List(2): Set[Int])
              ^
one error found

I think one could do this all day.

Miles Sabin | 1 Jul 19:56 2011

Re: Suspicious-looking order dependency in implicit resolution

On Fri, Jul 1, 2011 at 6:30 PM, Paul Phillips <paulp@...> wrote:
> If we comment out "implicitly[Int]" in c.scala, then this compiles, which poses interesting questions
which I'll skip past:
>
> % scalac c.scala c2.scala
>
>
> Not this though:
>
> % scalac c2.scala c.scala

Oh, interesting. I'd tried swapping the order of the source files on
the command line to see if that was responsible for an implicit (sic)
textual ordering, but that didn't have any effect in my simpler
scenario.

I have to say that if the behaviour is as inscrutable as it seems to
be then it's really borderline reckless to ever allow the result type
of an implicit to be inferred ... which is annoying, to say the least.

Cheers,

Miles

-- 
Miles Sabin
tel: +44 7813 944 528
gtalk: miles@...
skype: milessabin
http://www.chuusai.com/
http://twitter.com/milessabin

