scala-macros vs scala-virtualized
Tue, 2012-02-14, 16:23
I am far from a Scala expert and I could be wrong, but to my naive eyes
it looks like two large projects are entering Scala which both
accomplish pretty much the same thing (or at least overlap to a very
large extent):
scala-macros and scala-virtualized
They are both mechanisms that allow you to write normal-looking Scala
code which does something else in the background, like generating SQL
queries and what not. In other words, both scala-macros and
scala-virtualized are mechanisms for creating domain specific languages
in Scala.
Having two distinct and complex mechanisms for doing essentially the
same thing seems wasteful and redundant. Not to mention the possible
unintended interactions between two such complex features.
Now, if you accept that the two mechanisms are somewhat redundant (which
you might disagree with of course), the question turns to which one is
better for Scala programmers...
1) Scala-virtualized is much more in the spirit of Scala
Even before scala-virtualized you could implement e.g. flatMap() in your
classes and have them participate in Scala's for/yield protocol, or you
could override various operators and such. Using methods to override
behavior of Scala's builtin constructs is nothing new. Scala-virtualized
merely expands that existing precedent to all language constructs and
brings it to its logical conclusion.
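To make that concrete, here is the kind of thing I mean -- a throwaway sketch of mine (the Box class is made up), with nothing scala-virtualized about it:
class Box[A](val value: A) {
  // defining map/flatMap is all it takes for Box to work in for/yield
  def map[B](f: A => B): Box[B] = new Box(f(value))
  def flatMap[B](f: A => Box[B]): Box[B] = f(value)
}

val result = for {
  x <- new Box(2)
  y <- new Box(21)
} yield x * y   // a Box holding 42; the for/yield desugars to flatMap/map calls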
On the other hand, Scala had nothing like scala-macros before. It's a
completely new concept and somewhat alien in the Scala context. I've
used macros in Lisp to great effect, but Lisp is a language that is very
amenable to macros because Lisp code and data look the same. Scala code
and data look nothing alike, which brings me to the second point...
2) Scala-virtualized is much more intuitive to use for the library writer
With scala-virtualized you write your DSL implementations in
straightforward Scala code, similar to how you would write e.g. implicit
conversions to pimp a library or things of that nature.
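To pick a toy example of what I mean by straightforward Scala (the TimesOps name is just made up for illustration):
class TimesOps(n: Int) {
  // `body` is by-name, so `3 times { ... }` reads like a built-in control structure
  def times(body: => Unit): Unit = (1 to n).foreach(_ => body)
}
implicit def intToTimesOps(n: Int): TimesOps = new TimesOps(n)

3 times { println("hi") }   // underneath it is just an ordinary method call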
With scala-macros you are forced to write this weird AST meta-language
that looks nothing like normal Scala code. It's almost like having to
learn another language on top of Scala.
3) How do you even debug scala-macros?
In scala-virtualized the overridden behavior sits in normal methods that
a debugger could interactively step through at run time, whenever you
hit an overridden construct as you step through your DSL.
With scala-macros, the code inside the macro is executed in some
precompilation phase before the code runs. When you actually run the
program it's no longer the macro code that gets executed, it's the
*result* produced by the AST manipulations in the macro code. So when
stepping interactively with a debugger you can't really step through the
original macro code that generated the code you are currently running.
This makes finding bugs in the macros much more difficult as your macro
codebase grows.
4) Scala-macros might be more powerful
Not to be completely one-sided, I will concede that scala-macros might
be strictly more powerful than scala-virtualized. I have no formal proof
of this, but it sounds like rewriting/generating ASTs from scala-macros
might enable some crazy things that might not be possible (or easy) in
scala-virtualized. But are these corner cases where scala-macros is more
powerful than scala-virtualized really something that you would often
use in practice?
In summary, compared to scala-virtualized, scala-macros look like a
low-level hackish solution in search of a problem. They are complex to
write, un-scalaish, difficult to debug, and will probably just add to
Scala's image of "too much complex inscrutable magic".
Finally, my question to the Scala Community is:
What *practical* problems can you solve with scala-macros that you
couldn't solve just as easily with scala-virtualized?
Tue, 2012-02-14, 16:41
#1
Re: scala-macros vs scala-virtualized
I'm not equipped to answer the question, but I applaud the asking of it.
Tue, 2012-02-14, 16:51
#2
Re: scala-macros vs scala-virtualized
Great question.
I at least hope that whatever gets integrated in the future will make it possible to provide a better, type-provider-like way to access databases and other storage in a fully typed way. I think that is the killer feature which would spell a huge advantage for a lot of people in the Scala space.
Tue, 2012-02-14, 16:51
#3
Re: scala-macros vs scala-virtualized
... in the Java space. Sorry :-)
Tue, 2012-02-14, 17:01
#4
Re: scala-macros vs scala-virtualized
Hi Ivan,
On Tue, Feb 14, 2012 at 04:23:18PM +0100, Ivan Todoroski wrote:
> What *practical* problems can you solve with scala-macros that you
> couldn't solve just as easily with scala-virtualized?
Like you, I am not an expert at either the work on macros or on
virtualization. But given that macros are essentially "mini compiler
plugins" which do tree transformations there are some problems I face
which I bet they can solve.
The big one is optimizing away intermediate objects which are created
through use of the implicit enrichment pattern, but which aren't
needed. I talked about this a bit on the scala-sips mailing list
recently but here's a summary in code form:
import scala.{specialized => spec}
import com.azavea.math.Numeric
import com.azavea.math.FastImplicits._
// code user writes
def foo1[@spec A:Numeric](x:A, y:A) = x + y
// code with sugar removed
def foo2[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = x + y
// code after implicit resolution
def foo3[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = infixOps(x)(ev).+(y)
// code with infixOps method inlined
def foo4[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = new FastNumericOps(x)(ev).+(y)
// code i wish could get generated, basically inlining the
// implementation of FastNumericOps#+ without creating a new object.
def foo5[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = ev.plus(x, y)
This might seem a bit esoteric but this pattern also gets used with
Ordering and other type classes. It's really the only performance
problem left with this pattern, after specialization.
There are other cases too (until recently, for-loops) where after
profiling you can identify certain places where a (small) tree
transformation will yield huge gains. Virtualization may help in the
case of loops, but my sense is that it isn't designed to handle this
general class of problem.
Tue, 2012-02-14, 17:01
#5
Re: scala-macros vs scala-virtualized
Whereas you see them as accomplishing the same thing, I see them as
accomplishing very different things.
Like you, I might be wrong, of course. I see scala-virtualized as
generating code at run-time, whereas scala-macros generates code at
compile-time. They are both code generators, which, together with
string interpolators and virtpatmat, makes for four different new code
generators in the next version of Scala!
Some use cases particular to macros are ad-hoc performance
optimizations, type-safe evaluation of strings (e.g. formatting
strings, regex, JSON, and even XML if its literals are ever replaced
with string interpolation), and generation of code that needs to be
present as libraries.
On Tue, Feb 14, 2012 at 13:23, Ivan Todoroski wrote:
> I am far from a Scala expert and I could be wrong, but to my naive eyes it
> looks like two large projects are entering Scala which both accomplish
> pretty much the same thing (or at least overlap to a very large extent):
>
> scala-macros and scala-virtualized
>
> They are both mechanisms that allow you to write normal-looking Scala code
> which does something else in the background, like generating SQL queries and
> what not. In other words, both scala-macros and scala-virtualized are
> mechanisms for creating domain specific languages in Scala.
>
> Having two distinct and complex mechanisms for doing essentially the same
> thing seems wasteful and redundant. Not to mention the possible unintended
> interactions between two such complex features.
>
> Now, if you accept that the two mechanisms are somewhat redundant (which you
> might disagree with of course), the question turns to which one is better
> for Scala programmers...
>
>
> 1) Scala-virtualized is much more in the spirit of Scala
>
> Even before scala-virtualized you could implement e.g. flatMap() in your
> classes and have them participate in Scala's for/yield protocol, or you
> could override various operators and such. Using methods to override
> behavior of Scala's builtin constructs is nothing new. Scala-virtualized
> merely expands that existing precedent to all language constructs and brings
> it to its logical conclusion.
>
> On the other hand, Scala had nothing like scala-macros before. It's a
> completely new concept and somewhat alien in the Scala context. I've used
> macros in Lisp to great effect, but Lisp is a language that is very amenable
> to macros because Lisp code and data look the same. Scala code and data look
> nothing alike, which brings me to the second point...
>
>
> 2) Scala-virtualized is much more intuitive to use for the library writer
>
> With scala-virtualized you write your DSL implementations in straightforward
> Scala code, similar to how you would write e.g. implicit conversions to pimp
> a library or things of that nature.
>
> With scala-macros you are forced to write this weird AST meta-language that
> looks nothing like normal Scala code. It's almost like having to learn
> another language on top of Scala.
>
>
> 3) How do you even debug scala-macros?
>
> In scala-virtualized the overridden behavior sits in normal methods that a
> debugger could interactively step through at run time, whenever you hit an
> overridden construct as you step through your DSL.
>
> With scala-macros, the code inside the macro is executed in some
> precompilation phase before the code runs. When you actually run the program
> it's no longer the macro code that gets executed, it's the *result* produced
> by the AST manipulations in the macro code. So when stepping interactively
> with a debugger you can't really step through the original macro code that
> generated the code you are currently running. This makes finding bugs in the
> macros much more difficult as your macro codebase grows.
>
>
> 4) Scala-macros might be more powerful
>
> Not to be completely one-sided, I will concede that scala-macros might be
> strictly more powerful than scala-virtualized. I have no formal proof of
> this, but it sounds like rewriting/generating ASTs from scala-macros might
> enable some crazy things that might not be possible (or easy) in
> scala-virtualized. But are these corner cases where scala-macros is more
> powerful than scala-virtualized really something that you would often use in
> practice?
>
>
> In summary, compared to scala-virtualized, scala-macros look like a
> low-level hackish solution in search of a problem. They are complex to
> write, un-scalaish, difficult to debug, and will probably just add to
> Scala's image of "too much complex inscrutable magic".
>
>
> Finally, my question to the Scala Community is:
>
> What *practical* problems can you solve with scala-macros that you couldn't
> solve just as easily with scala-virtualized?
Tue, 2012-02-14, 17:01
#6
Re: scala-macros vs scala-virtualized
I was wondering too but I think the answer is that scala virtualized is limited to expressions (and thus does not allow for generating class declarations for instance).
On Tue, Feb 14, 2012 at 16:23, Ivan Todoroski <grnch_lists@gmx.net> wrote:
I am far from a Scala expert and I could be wrong, but to my naive eyes it looks like two large projects are entering Scala which both accomplish pretty much the same thing (or at least overlap to a very large extent):
scala-macros and scala-virtualized
They are both mechanisms that allow you to write normal-looking Scala code which does something else in the background, like generating SQL queries and what not. In other words, both scala-macros and scala-virtualized are mechanisms for creating domain specific languages in Scala.
Having two distinct and complex mechanisms for doing essentially the same thing seems wasteful and redundant. Not to mention the possible unintended interactions between two such complex features.
Now, if you accept that the two mechanisms are somewhat redundant (which you might disagree with of course), the question turns to which one is better for Scala programmers...
1) Scala-virtualized is much more in the spirit of Scala
Even before scala-virtualized you could implement e.g. flatMap() in your classes and have them participate in Scala's for/yield protocol, or you could override various operators and such. Using methods to override behavior of Scala's builtin constructs is nothing new. Scala-virtualized merely expands that existing precedent to all language constructs and brings it to its logical conclusion.
On the other hand, Scala had nothing like scala-macros before. It's a completely new concept and somewhat alien in the Scala context. I've used macros in Lisp to great effect, but Lisp is a language that is very amenable to macros because Lisp code and data look the same. Scala code and data look nothing alike, which brings me to the second point...
2) Scala-virtualized is much more intuitive to use for the library writer
With scala-virtualized you write your DSL implementations in straightforward Scala code, similar to how you would write e.g. implicit conversions to pimp a library or things of that nature.
With scala-macros you are forced to write this weird AST meta-language that looks nothing like normal Scala code. It's almost like having to learn another language on top of Scala.
3) How do you even debug scala-macros?
In scala-virtualized the overridden behavior sits in normal methods that a debugger could interactively step through at run time, whenever you hit an overridden construct as you step through your DSL.
With scala-macros, the code inside the macro is executed in some precompilation phase before the code runs. When you actually run the program it's no longer the macro code that gets executed, it's the *result* produced by the AST manipulations in the macro code. So when stepping interactively with a debugger you can't really step through the original macro code that generated the code you are currently running. This makes finding bugs in the macros much more difficult as your macro codebase grows.
4) Scala-macros might be more powerful
Not to be completely one-sided, I will concede that scala-macros might be strictly more powerful than scala-virtualized. I have no formal proof of this, but it sounds like rewriting/generating ASTs from scala-macros might enable some crazy things that might not be possible (or easy) in scala-virtualized. But are these corner cases where scala-macros is more powerful than scala-virtualized really something that you would often use in practice?
In summary, compared to scala-virtualized, scala-macros look like a low-level hackish solution in search of a problem. They are complex to write, un-scalaish, difficult to debug, and will probably just add to Scala's image of "too much complex inscrutable magic".
Finally, my question to the Scala Community is:
What *practical* problems can you solve with scala-macros that you couldn't solve just as easily with scala-virtualized?
Tue, 2012-02-14, 17:11
#7
Re: scala-macros vs scala-virtualized
On 14/02/2012 16:36, Paul Phillips wrote:
> I'm not equipped to answer the question, but I applaud the asking of it.
>
That is also something that has come up recently in discussions in Scala user groups
(well, at least PSUG), where users have started to ask when Scala will start to
*remove* features, not add huge ones like this - let alone two as big
as these, which sit rather far from the orthogonal feature set that Scala
advertises (you know, "deep not broad" - well, it will become hard to
explain what is not broad in Scala)
Thanks,
Tue, 2012-02-14, 17:21
#8
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 13:46, Francois wrote:
> On 14/02/2012 16:36, Paul Phillips wrote:
>>
>> I'm not equipped to answer the question, but I applaud the asking of it.
>>
>
>
> That is also something that has come up recently in discussions in Scala user groups
> (well, at least PSUG), where users have started to ask when Scala will start to
> *remove* features, not add huge ones like this - let alone two as big as
> these, which sit rather far from the orthogonal feature set that Scala advertises
> (you know, "deep not broad" - well, it will become hard to explain what is
> not broad in Scala)
Macros + Interpolation make it possible to remove XML while, at the
same time, providing it as a library. And, like XML, one could
likewise provide JSON, etc.
Tue, 2012-02-14, 17:31
#9
Re: scala-macros vs scala-virtualized
On 14.02.2012 16:57, Daniel Sobral wrote:
> Macros + Interpolation make it possible to remove XML while, at the
> same time, providing it as a library. And, like XML, one could
> likewise provide JSON, etc.
Is string interpolation intimately tied with macros though? I thought
(perhaps wrongly) that they were orthogonal features. Would it be
possible to make XML an optional library by using the interpolation
support together with scala-virtualized or whatever other Scala features
aside from macros?
Tue, 2012-02-14, 17:41
#10
Re: scala-macros vs scala-virtualized
Isn't this something that the inline implicit classes proposal would
address (or value classes, whatever they are called now)?
On 14.02.2012 16:44, Erik Osheim wrote:
> Hi Ivan,
>
> On Tue, Feb 14, 2012 at 04:23:18PM +0100, Ivan Todoroski wrote:
>> What *practical* problems can you solve with scala-macros that you
>> couldn't solve just as easily with scala-virtualized?
>
> Like you, I am not an expert at either the work on macros or on
> virtualization. But given that macros are essentially "mini compiler
> plugins" which do tree transformations there are some problems I face
> which I bet they can solve.
>
> The big one is optimizing away intermediate objects which are created
> through use of the implicit enrichment pattern, but which aren't
> needed. I talked about this a bit on the scala-sips mailing list
> recently but here's a summary in code form:
>
> import scala.{specialized => spec}
> import com.azavea.math.Numeric
> import com.azavea.math.FastImplicits._
>
> // code user writes
> def foo1[@spec A:Numeric](x:A, y:A) = x + y
>
> // code with sugar removed
> def foo2[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = x + y
>
> // code after implicit resolution
> def foo3[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = infixOps(x)(ev).+(y)
>
> // code with infixOps method inlined
> def foo4[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = new FastNumericOps(x)(ev).+(y)
>
> // code i wish could get generated, basically inlining the
> // implementation of FastNumericOps#+ without creating a new object.
> def foo5[@spec A](x:A, y:A)(implicit ev:Numeric[A]) = ev.plus(x, y)
>
> This might seem a bit esoteric but this pattern also gets used with
> Ordering and other type classes. It's really the only performance
> problem left with this pattern, after specialization.
>
> There are other cases too (until recently, for-loops) where after
> profiling you can identify certain places where a (small) tree
> transformation will yield huge gains. Virtualization may help in the
> case of loops, but my sense is that it isn't designed to handle this
> general class of problem.
>
Tue, 2012-02-14, 17:51
#11
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 05:24:12PM +0100, Ivan Todoroski wrote:
> Isn't this something that the inline implicit classes proposal would
> address (or value classes, whatever they are called now)?
I'm glad you asked!
After discussing this on the scala-sips mailing list, we determined
that value classes would not support this feature. You may want to
revisit that thread for more information:
http://groups.google.com/group/scala-sips/browse_thread/thread/ad21c133a...
Tue, 2012-02-14, 18:01
#12
Re: scala-macros vs scala-virtualized
On 14.02.2012 16:54, Daniel Sobral wrote:
> I see scala-virtualized as
> generating code at run-time, whereas scala-macros generates code at
> compile-time.
I understand this, but I consider it a mere implementation detail. What
do I care when the code is generated?
Maybe I am approaching this from a different perspective than you. I am
not a compiler developer, I am just a Scala programmer who wants to use
the language productively.
I am interested in things like expressiveness, conciseness and readability.
I am interested in removing boilerplate as much as possible (which is
why I am interested in scala-virtualized and scala-macros).
Finally, I am interested in doing all this in a statically checked
typesafe manner that eliminates runtime errors as much as possible.
As long as I get these features, I don't care how they are accomplished.
Whether through compile-time or run-time code generation, I honestly
don't care as a regular Scala programmer.
What I do care about is the severely increased cognitive load of using
scala-macros. I need to learn a whole new AST sub-language just to
eliminate some boilerplate here and there. Plus all the other things
mentioned in my original email that started this thread.
> Some use cases particular to macros are ad-hoc performance
> optimizations
I believe low-level optimizations are something that the compiler and
libraries should take care of. I still try to write code that is as
optimal as possible without sacrificing readability too much, but expecting me to
write what amounts to mini compiler plugins to help the compiler with
low-level optimizations is not what I would consider a common use case
for a regular Scala programmer.
The optimization use case on its own is certainly not enough to justify
inclusion of a complex feature like scala-macros. On the rare occasions
when you really needed to optimize some critical bit of code and the
Scala compiler was letting you down, you could also write an actual
Scala compiler plugin, or just write it in C and call it from JNI.
> type-safe evaluation of strings (eg: formatting
> strings, regex, json, and even xml if its literals are ever replaced
> with string interpolation), and generation of code that needs to be
> present as libraries.
This is a very interesting point. I already asked you about the
relationship between interpolation and macros in another sub-thread, I
would appreciate if you or anyone else could shed some light on this.
Tue, 2012-02-14, 18:01
#13
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 14:21, Ivan Todoroski wrote:
> On 14.02.2012 16:57, Daniel Sobral wrote:
>>
>> Macros + Interpolation make it possible to remove XML while, at the
>> same time, providing it as a library. And, like XML, one could
>> likewise provide JSON, etc.
>
>
> Is string interpolation intimately tied with macros though? I thought
> (perhaps wrongly) that they were orthogonal features. Would it be possible
> to make XML an optional library by using the interpolation support together
> with scala-virtualized or whatever other Scala features aside from macros?
No, they are not tied. Well, macros have a dependency on string
interpolation to provide quasi-quotations.
But while string interpolation will give you XML literals and
matching, it won't check syntax at compile time. Using macros for the
interpolator makes that possible.
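Roughly, the non-macro half could look like this (a sketch of my own, assuming the proposed StringContext-based desugaring of interpolated strings; XmlInterpolator is a made-up name). The interpolator builds and parses the XML, but only at run time, so malformed markup blows up when the code runs; a macro implementation of the same xml method could run that parse during compilation and turn bad markup into a compile error.
import scala.xml.{Elem, XML}

class XmlInterpolator(sc: StringContext) {
  // splice the arguments with the standard `s` interpolator, then parse --
  // all the checking happens here, at run time
  def xml(args: Any*): Elem = XML.loadString(sc.s(args: _*))
}
implicit def toXmlInterpolator(sc: StringContext): XmlInterpolator =
  new XmlInterpolator(sc)

val name = "world"
val greeting = xml"<greeting to='$name'/>"   // run-time parse, run-time failure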
Tue, 2012-02-14, 18:11
#14
Re: scala-macros vs scala-virtualized
On 14.02.2012 16:58, Paul Brauner wrote:
> I was wondering too but I think the answer is that scala virtualized is
> limited to expressions (and thus does not allow for generating class
> declarations for instance).
Macros can synthesize new class declarations from thin air rather than
just using existing classes you defined elsewhere? Now that does sound
pretty cool, but again I feel like I must insist on putting things in
perspective regarding the practical utility of all this.
What *practical* problem can be solved by synthesizing new class
declarations from a macro, that couldn't be solved using
scala-virtualized in a different yet still satisfactory way?
I am not trying to be contrary, I honestly wish to be educated on the
new possibilities that are opened by scala-macros and scala-virtualized,
and in particular the practical limits on what can be accomplished with
each of those features.
Tue, 2012-02-14, 18:21
#15
Re: scala-macros vs scala-virtualized
The scalamacros.org site has a bunch of desired features for macros:
http://scalamacros.org/usecases/index.html
Most of these can't be achieved with scala-virtualized as it now stands.
--Rex
On Tue, Feb 14, 2012 at 11:51 AM, Ivan Todoroski <grnch_lists@gmx.net> wrote:
> On 14.02.2012 16:58, Paul Brauner wrote:
>> I was wondering too but I think the answer is that scala virtualized is limited to expressions (and thus does not allow for generating class declarations for instance).
> Macros can synthesize new class declarations from thin air rather than just using existing classes you defined elsewhere? Now that does sound pretty cool, but again I feel like I must insist on putting things in perspective regarding the practical utility of all this.
> What *practical* problem can be solved by synthesizing new class declarations from a macro, that couldn't be solved using scala-virtualized in a different yet still satisfactory way?
> I am not trying to be contrary, I honestly wish to be educated on the new possibilities that are opened by scala-macros and scala-virtualized, and in particular the practical limits on what can be accomplished with each of those features.
Tue, 2012-02-14, 18:31
#16
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 14:42, Ivan Todoroski wrote:
> On 14.02.2012 16:54, Daniel Sobral wrote:
>>
>> I see scala-virtualized as
>> generating code at run-time, whereas scala-macros generates code at
>> compile-time.
>
> I understand this, but I consider it a mere implementation detail. What do I
> care when the code is generated?
Is it? Do you see no difference between a statically typed language
and a dynamically typed language? The only difference is when the type
check is made. Is it an implementation detail?
>> Some use cases particular to macros are ad-hoc performance
>> optimizations
>
> I believe low-level optimizations are something that the compiler and
> libraries should take care of. I still try to write as optimal code as
> possible without sacrificing readability too much, but expecting me to write
> what amounts to mini compiler plugins to help the compiler with low-level
> optimizations is not what I would consider a common use case for a regular
> Scala programmer.
That's where "ad-hoc" comes in, really. LIBRARIES CANNOT TAKE CARE OF
IT with the present support: they can't get rid of instance creation
and boxing. Yes, as a regular Scala programmer you won't be _creating_
macros, but if your for-loops suddenly become as fast as Java's, that
will be a library-provided macro. Neither can the compiler solve it,
because it doesn't have the knowledge that the library has. Or do you
think Scala has the performance issues it has simply because no one
tried to fix them?
> The optimization use case on its own is certainly not enough to justify
> inclusion of a complex feature like scala-macros. In the rare occasions
> where you really needed to optimize some critical bit of code and the Scala
> compiler was letting you down, you could also write an actual Scala compiler
> plugin, or just write it in C and call it from JNI.
Talk to Yammer about that. If your "critical bit" means 80% of your
code, what use is it?
Now, compiler plugins suffer from three problems:
1. They are too hard to write.
2. They depend on compiler internals, which are too volatile.
3. You can't distribute them as a JAR.
So, what happens if you handle that? You get a macro. A macro is
really a specialized compiler plugin that doesn't suffer from those
three problems.
C and JNI? Unless your code spends most of its time in C, that won't
help you. The cost of calling JNI is often much higher than the gains
in performance that C will bring. But that's not a solution: you are
throwing away all of Scala's power when you do that. Consider:
Scala-with-macros:

def factorial(n: Int) = {
  var result = 1
  for (i <- 2 to n) result *= i
  result
}

Scala-without-macros:

def factorial(n: Int) = {
  // declare a JNI binding and call C code that computes the factorial
}
That is what you proposed.
>> type-safe evaluation of strings (eg: formatting
>> strings, regex, json, and even xml if its literals are ever replaced
>> with string interpolation), and generation of code that needs to be
>> present as libraries.
>
> This is a very interesting point. I already asked you about the relationship
> between interpolation and macros in another sub-thread, I would appreciate
> if you or anyone else could shed some light on this.
That is pretty simple, and it comes down to that "implementation
detail" you mentioned. Say you have this:
printf("%d: %f%n", a, b)
At run time you can check whether "a" is an Int and "b" is a Double,
and throw an exception if they are not. At compile time you can check
this and produce a compilation error if they are not. The compiler
can't handle this unless it has knowledge of string formats used with
"printf", and, then, it won't be able to handle anything *like* this.
A compiler plugin can handle this, but we are back to the point that a
macro is nothing more than a light-weight plugin that's easier to
create, maintain and distribute.
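To spell out the run-time half with a tiny sketch of my own: today this compiles happily and only fails once it executes.
val a = "oops"   // not an Int, but the compiler has no idea the format string cares
val b = 3.14
printf("%d: %f%n", a, b)   // throws java.util.IllegalFormatConversionException at run time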
Of course, Paul wants compiler plugins to be easier to create,
maintain and distribute, and *that* would represent a more direct
competition to macros.
Tue, 2012-02-14, 18:51
#17
Re: scala-macros vs scala-virtualized
One problem macros were supposed to solve is the handling of boilerplate
classes like tuples and functions. Instead of writing Tuple2,
Tuple3, etc. by hand, you can generate them with macros.
Basically any abstraction problem you have where the limitation is in
the syntax of your language can be solved by macros. A simple example
that lots of people complain about is Scala's lack of a good optimized
for loop. Scala internal mailing list threads suggest that optimizing
Scala's general for construct is incredibly difficult. With macros,
this becomes a simple syntax problem with a simple solution.
Other good examples come from already-implemented language features,
like case classes. We may see case classes as a wonderful thing, but
Scheme programmers would laugh at us for doing so, as in the end they
are just syntactic sugar that Scheme programmers can and have
implemented as a library (http://docs.racket-lang.org/reference/define-struct.html).
Now, you might say to yourself, "that's great, but case classes are
already implemented". Well, what if I want to add features to case
classes? What if I want to add a method to each case object that
returns a string version of the name of the object, which I don't
believe is available via reflection? What if, for a given Scala
enumeration, I want to automatically generate a Java version of that
enumeration for interop with Java libraries? What if I just want to
have an improved version of Scala enumerations, which most people are
currently unhappy with and don't use? With macros, all of these things
become feasible.
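To make that concrete, this is the sort of thing I have to write by hand today (a quick sketch with a made-up Color example); the point of a macro would be to generate the name implementations -- and, say, the values list -- from the declarations alone:
sealed trait Color { def name: String }
case object Red   extends Color { val name = "Red" }
case object Green extends Color { val name = "Green" }
case object Blue  extends Color { val name = "Blue" }

val values = List(Red, Green, Blue)   // also maintained by hand today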
Once macros are part of your toolkit, you start to realize that lots
of abstraction problems that you previously accepted as a fact of life
suddenly become conquerable. I for one am very excited to see if Scala
can copy some of the success of Nemerle's macro system.
On Feb 14, 10:51 am, Ivan Todoroski wrote:
> On 14.02.2012 16:58, Paul Brauner wrote:
>
> > I was wondering too but I think the answer is that scala virtualized is
> > limited to expressions (and thus does not allow for generating class
> > declarations for instance).
>
> Macros can synthesize new class declarations from thin air rather than
> just using existing classes you defined elsewhere? Now that does sound
> pretty cool, but again I feel like I must insist on putting things in
> perspective regarding the practical utility of all this.
>
> What *practical* problem can be solved by synthesizing new class
> declarations from a macro, that couldn't be solved using
> scala-virtualized in a different yet still satisfactory way?
>
> I am not trying to be contrary, I honestly wish to be educated on the
> new possibilities that are opened by scala-macros and scala-virtualized,
> and in particular the practical limits on what can be accomplished with
> each of those features.
Tue, 2012-02-14, 19:11
#18
Re: Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 15:46, sreque wrote:
>
> Basically any abstraction problem you have where the limitation is in
> the syntax of your language can be solved by macros. A simple example
> that lots of people complain about is Scala's lack of a good optimized
> for loop. Scala internal mailing list threads suggest that optimizing
> Scala's general for construct is incredibly difficult. With macros,
> this becomes a simple syntax problem with a simple solution.
That isn't entirely true. Paul did manage to get Range's foreach to
the same speed as Java's (after warm up, mind you), and the complexity
associated with it can't be gotten rid of even with macros, except for cases
where the indices are literals or known constants.
Tue, 2012-02-14, 19:21
#19
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 05:42:15PM +0100, Ivan Todoroski wrote:
> I believe low-level optimizations are something that the compiler
> and libraries should take care of. I still try to write as optimal
> code as possible without sacrificing readability too much, but
> expecting me to write what amounts to mini compiler plugins to help
> the compiler with low-level optimizations is not what I would
> consider a common use case for a regular Scala programmer.
Just to echo Daniel here, I am a library author who wants to do
optimizations which my users won't have to see to benefit from. I have
written a compiler plugin but it's a bit fragile and not as advanced as
I'd like. Also, many users try to avoid compiler plugins.
I certainly like the idea of being able to use macros instead, since
macros will probably be more "future-proof" than compiler plugins
currently are (or at least, have been).
That said, if the compiler API is stable, compiler plugins get more
advanced, and the community is ready to embrace them, then I agree that
maybe macros aren't needed.
Tue, 2012-02-14, 19:31
#20
Re: scala-macros vs scala-virtualized
I suppose part of the difference between generating code at compile
time and at run time is that compile-time code generation can be
statically checked, which is a pretty big thing.
Play! for example has a pretty extensive/complex custom compilation
process to compile your templates into class files, so your
controller->template calls can be verified at compile time, or even
before, during code completion! It also compiles your
LESS/Coffeescript. Being able to treat non-scala code as part of the
program (i.e. statically checked and verified) rather than as data
(dynamically load & pray) would be a pretty big plus.
Being able to statically check more things, e.g.:
CSS files
HTML templates
Coffeescript/Javascript
XML config files
non-XML config files
database schemas (à la F# Type Providers)
regex literals
which are de facto part of your "program" rather than the "data" would
be pretty awesome (naturally sometimes these things are dynamic and
need to be treated as such, but quite often they are not). One problem
with static languages is that when you integrate with external things
(config files, databases etc.) you lose all static checking and end up
getting lots of silly runtime errors anyway. Being able to check at
least some of these things at compile-time would be sweet. You can do
all these as compiler plugins, and in a way macros are just a way of
making it easier and more regular.
-Haoyi
On Tue, Feb 14, 2012 at 12:05 PM, Rex Kerr wrote:
> The scalamacros.org site has a bunch of desired features for macros:
> http://scalamacros.org/usecases/index.html
>
> Most of these can't be achieved with scala-virtualized as it now stands.
>
> --Rex
>
>
> On Tue, Feb 14, 2012 at 11:51 AM, Ivan Todoroski wrote:
>>
>> On 14.02.2012 16:58, Paul Brauner wrote:
>>>
>>> I was wondering too but I think the answer is that scala virtualized is
>>> limited to expressions (and thus does not allow for generating class
>>> declarations for instance).
>>
>>
>> Macros can synthesize new class declarations from thin air rather than
>> just using existing classes you defined elsewhere? Now that does sound
>> pretty cool, but again I feel like I must insist on putting things in
>> perspective regarding the practical utility of all this.
>>
>> What *practical* problem can be solved by synthesizing new class
>> declarations from a macro, that couldn't be solved using scala-virtualized
>> in a different yet still satisfactory way?
>>
>> I am not trying to be contrary, I honestly wish to be educated on the new
>> possibilities that are opened by scala-macros and scala-virtualized, and in
>> particular the practical limits on what can be accomplished with each of
>> those features.
>
>
Tue, 2012-02-14, 19:41
#21
Re: scala-macros vs scala-virtualized
Looking at the last thread on the subject, I see you were in charge of
making the benchmarking harness, Daniel, so you probably know what
you're talking about here. :-)
Still, the final impression I got from the thread at
http://groups.google.com/group/scala-internals/browse_thread/thread/1834...
was that there were still some sticky issues left and your best bet
was to stick with while loops if it really mattered.
Also, I don't know the limitations of Scala's macros, but presumably
with them I could easily write a construct like:
cFor(var i = 0; i < n; i += 1) { body ... }
that would compile down to:
{
  var i = 0
  while (i < n) {
    body ...
    i += 1
  }
}
Similarly, you could write for loop constructs that simulate Java's
foreach statement on iterable objects to avoid the allocation of
closures for the loop body. Of course, efficiently implementing a full-
fledged loop with multiple assignments, breaks, and continues is a
much bigger problem, but also hopefully doable. The point here is that
with macros your performance is going to be much more predictable and
less reliant on your JIT, the size of your method bodies, and the
current phase of the moon.
Another example of a use case I had recently had to do with pattern
matching on syntax. Basically, I need two versions of a function: one
that runs really fast and one that prints out helpful error messages
once all possible matches fail. Match failures are common and
acceptable as long as at least one possible match exists, but the
matching code runs in a tight enough loop that using something like
closures to control whether or not helpful error messages are
generated on match failures would still have a significant negative
impact on performance. Macros make it possible to generate both the
optimized version and the diagnostic version of a function from the
same source, and the diagnostic version could then be run when the
optimized version reports an error to find out the exact causes of the
error.
On Feb 14, 12:00 pm, Daniel Sobral wrote:
> On Tue, Feb 14, 2012 at 15:46, sreque wrote:
>
> > Basically any abstraction problem you have where the limitation is in
> > the syntax of your language can be solved by macros. A simple example
> > that lots of people complain about is Scala's lack of a good optimized
> > for loop. Scala internal mailing list threads suggest that optimizing
> > Scala's general for construct is incredibly difficult. With macros,
> > this becomes a simple syntax problem with a simple solution.
>
> That isn't entirely true. Paul did manage to get Range's foreach to
> the same speed as Java's (after warm up, mind you), and the complexity
> associated with it can't be gotten rid of even with macros, except for cases
> where the indices are literals or known constants.
>
> --
> Daniel C. Sobral
>
> I travel to the future all the time.
Tue, 2012-02-14, 20:31
#22
Re: Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 16:22, sreque wrote:
> Looking at the last thread on the subject, I see you were in charge of
> making the benchmarking harness Daniel, so you probably know what
> you're talking about here. :-)
>
> Still, the final impression I got from the thread at
> http://groups.google.com/group/scala-internals/browse_thread/thread/1834...
> was that there were still some sticky issues left and your best bet
> was to stick with while loops if it really mattered.
No, I haven't got back to it -- there was no Caliper artifact
available last time I tried to -- but my own code was while-loop-fast,
though buggy. Paulp said he also got there, and I believe him.
However, the commit message on the last change to Range indicated he
wasn't done yet. Maybe that's true, maybe not, but I haven't been able
to check how fast that is due to the problem I mentioned.
> Also, I don't know the limitations of Scala's macros, but presumably
> with them I could easily write a construct like:
>
>
> cFor(var i = 0; i < n; i += 1) { body ... }
>
> that would compile down to:
>
> {
> var i = 0
> while(i < n)
> {
> body ...
> i += 1
> }
> }
Well, the start index is constant, and the loop increment is 1, which
makes that code correct. But say, for example, that the increment is
2, and it suddenly stops being correct. Oh, I'll grant that it is
_probably_ correct, but only the programmer can know that, not the
compiler. Specifically, it would fail for n == Int.MaxValue. Now
change the test to <= instead of <, and keep the increment equal to 1,
and you'll get broken code again, for the same value of n.
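To illustrate the wrap-around with a self-contained sketch (the steps guard is only there so the demo terminates; the naive while-loop translation would spin forever):
val n = Int.MaxValue
var i = n - 1      // with an increment of 2, i < n still holds here
var steps = 0
while (i < n && steps < 5) {
  i += 2           // 2147483646 + 2 overflows to Int.MinValue, which is again < n
  steps += 1
}
println("i = " + i + " after " + steps + " steps")   // i is negative; without the guard the loop never ends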
Range has two problems: one, it doesn't know the step (1 to 5 by 1 --
the step isn't known at the time "1 to 5" is initialized), and, two, it
has to deal with boundary conditions. Macros can help with the former
if the step is known at compile time, which is not always the case
either. There's a LOT of code required to handle all these conditions,
and that's what gets in the way.
The JVM can optimize those away, however, which gets us back on the same
footing as simple while loops -- once the optimization has kicked in.
> much bigger problem, but also hopefully doable. The point here is that
> with macros your performance is going to be much more predictable and
> less reliant on your JIT, the size of your method bodies, and the
> current phase of the moon.
If you don't trust JIT, you can't rely on your performance. :-) Code
that looks pretty simple and fast can fail to be JITted if you don't
pay attention to it -- which is what happened to foreach in Scala 2.8
and 2.9.
Not that I don't agree with you on the general issue of macros and
performance, but it's not that simple either.
Tue, 2012-02-14, 20:51
#23
Re: scala-macros vs scala-virtualized
Hi Daniel,
Thank you for taking the time to respond in depth.
On 14.02.2012 18:22, Daniel Sobral wrote:
>>> I see scala-virtualized as
>>> generating code at run-time, whereas scala-macros generates code at
>>> compile-time.
>> I understand this, but I consider it a mere implementation detail. What do I
>> care when the code is generated?
>
> Is it? Do you see no difference between a statically typed language
> and a dynamically typed language? The only difference is when the type
> check is made. Is it an implementation detail?
I was under the impression that DSLs written using scala-virtualized are
still statically type checked at compile time. Therefore the fact that
scala-virtualized "generates code at run time" has no bearing on the
type check. The difference between statically and dynamically typed
language is a red herring, it has no bearing on this discussion.
(Of course, scala-virtualized doesn't really generate new code at
runtime, that's why I put it in quotes above, but I guess you were
making an analogy so it's close enough)
>>> Some use cases particular to macros are ad-hoc performance
>>> optimizations
>> I believe low-level optimizations are something that the compiler and
>> libraries should take care of.
>
> That's where "ad-hoc" comes in, really. LIBRARIES CANNOT TAKE CARE OF
> IT with the present support: they can't get rid of instance creation
> and boxing. Yes, as a regular Scala programmer you won't be _creating_
> macros, but if your for-loops suddenly become as fast as Java's, that
> will be a library-provided macro. Neither can the compiler solve it,
> because it doesn't have the knowledge that the library has. Or you
> think Scala has the performance issues it has simply because no one
> tried to fix them?
I don't understand... if a library has the knowledge of how to make general
for-loops as fast as Java's, why can't the compiler be enhanced to use the
same knowledge?
>> The optimization use case on its own is certainly not enough to justify
>> inclusion of a complex feature like scala-macros. In the rare occasions
>> where you really needed to optimize some critical bit of code and the Scala
>> compiler was letting you down, you could also write an actual Scala compiler
>> plugin, or just write it in C and call it from JNI.
>
> Talk to Yammer about that. If your "critical bit" means 80% of your
> code, what use is it?
So if macros were available to them, the Yammer programmers would have
spent their time writing complicated optimization macros instead of
working on their problem domain?
Writing code optimizers is a difficult task, especially on the JVM where
you have to second guess what the JVM might or might not do with your
code, and what shape of bytecode is most palatable to the JIT.
Don't get me wrong, I see what you are saying here, I understand that
macros can be used to optimize code specific to your problem, but is
that really the main use case for macros?
It boils down to saying "well, the compiler sucks and can't optimize
away instance creation or other bottlenecks I'm running into, so I am
going to use a complicated macro feature to work around compiler's
deficiencies".
It's a valid use, but it somehow seems less satisfying as a rationale
for such a feature.
I would prefer to see features that enhance expressiveness and reduce
boilerplate, yet are still easily accessible to regular day-to-day
programmers without too much cognitive load.
> Now, compiler plugins suffer from three problems:
>
> 1. They are too hard to write.
> 2. They depend on compiler internals, which are too volatile.
> 3. You can't distribute them as a JAR.
>
> So, what happens if you handle that? You get a macro. A macro is
> really a specialized compiler plugin that doesn't suffer from those
> three problems.
If I understand you correctly, you are saying that macros are basically
compiler plugins that are easier to create and distribute. But it seems
they are still too difficult for general programmers to use to reduce
various boilerplate here and there.
Scala-virtualized seems a much lighter-weight feature for letting regular
programmers reduce boilerplate and create DSLs, without having to
learn arcane ASTs.
> C and JNI?
Yeah, I went a bit overboard with JNI, sorry about that.
>> This is a very interesting point. I already asked you about the relationship
>> between interpolation and macros in another sub-thread, I would appreciate
>> if you or anyone else could shed some light on this.
>
> That is pretty simple, and it comes down to that "implementation
> detail" you mentioned. Say you have this:
>
> printf("%d: %f%n", a, b)
>
> At run time you can check whether "a" is an Int and "b" is a Double,
> and throw an exception if they are not. At compile time you can check
> this and produce a compilation error if they are not. The compiler
> can't handle this unless it has knowledge of string formats used with
> "printf", and, then, it won't be able to handle anything *like* this.
> A compiler plugin can handle this, but we are back to the point that a
> macro is nothing more than a light-weight plugin that's easier to
> create, maintain and distribute.
But why do you need *macros* for this? This looks more like a job for
pluggable type providers, in combination with string interpolation.
It does come down to an "implementation detail", i.e. whether you
implement something like this with a specific feature designed for it
(pluggable type providers) or you use some general blunt tool (macros)
which seems like overkill.
Tue, 2012-02-14, 21:31
#24
Re: scala-macros vs scala-virtualized
Yes, I am aware of that page; that site is where I learned about
scala-macros in the first place.
The very first use case on that page, about Advanced DSLs, compares how
much more macros can achieve than Squeryl, for example, and they list
limitations of Squeryl such as its inability to use the == operator,
having to use === instead (which is expected, since it's based on plain
Scala).
Yet they completely ignore the existence of Scala Integrated Query[1]
which is actually built on top of scala-virtualized and solves that
particular problem in Squeryl, along with many more. So that comparison
is misleading. I would love to see a more realistic comparison of SQL
generation by macros vs SIQ.
In the other use cases they mention type providers which are an
orthogonal concept not intrinsically tied to macros (even though their
current implementation might be tightly coupled to macros, I don't know).
Code generation for things like FunctionN and TupleN is certainly
compelling, but I can't help but feel that maybe a language feature for
abstracting over the number of method parameters might be a better
solution to that.
[1]
http://scala-integrated-query.googlecode.com/files/SIQ-Scala-Days-final.pdf
On 14.02.2012 18:05, Rex Kerr wrote:
> The scalamacros.org site has a bunch of desired
> features for macros:
> http://scalamacros.org/usecases/index.html
>
> Most of these can't be achieved with scala-virtualized as it now stands.
>
> --Rex
>
> On Tue, Feb 14, 2012 at 11:51 AM, Ivan Todoroski wrote:
>
> On 14.02.2012 16:58, Paul Brauner wrote:
>
> I was wondering too but I think the answer is that scala
> virtualized is limited to expressions (and thus does not allow
> for generating class declarations for instance).
>
>
> Macros can synthesize new class declarations from thin air rather
> than just using existing classes you defined elsewhere? Now that
> does sound pretty cool, but again I feel like I must insist on
> putting things in perspective regarding the practical utility of all
> this.
>
> What *practical* problem can be solved by synthesizing new class
> declarations from a macro, that couldn't be solved using
> scala-virtualized in a different yet still satisfactory way?
>
> I am not trying to be contrary, I honestly wish to be educated on
> the new possibilities that are opened by scala-macros and
> scala-virtualized, and in particular the practical limits on what
> can be accomplished with each of those features.
>
>
Tue, 2012-02-14, 21:41
#25
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 17:48, Ivan Todoroski wrote:
>
>> Is it? Do you see no difference between a statically typed language
>> and a dynamically typed language? The only difference is when the type
>> check is made. Is it an implementation detail?
>
>
> I was under the impression that DSLs written using scala-virtualized are
> still statically type checked at compile time. Therefore the fact that the
> scala-virtualized "generates code at run time" has no bearing on the type
> check. The difference between statically and dynamically typed language is a
> red herring, it has no bearing on this discussion.
Then you are under an incorrect assumption. By "generate code" I mean
"compile", with all that it entails, *including* typing.
Here's a simple comparison:
Scala-virtualized (rough example, based on vague memories from long ago):
val s = VString[Regex]("\d+$")
This will type check if Regex is a valid parameter to VString,
indicating it will be able to process the string in question. Also,
"s" will be known as a VString, and will be type-checked through-out.
Scala-macros:
val s = regex("\d+$")
The same guarantees made by Scala-virtualized are valid here, BUT, the
code for "regex" will be run at compile-time, and it will be able not
only to compile the regex itself at compile time, but also to verify
that the regex is indeed valid.
>> That's where "ad-hoc" comes in, really. LIBRARIES CANNOT TAKE CARE OF
>> IT with the present support: they can't get rid of instance creation
>> and boxing. Yes, as a regular Scala programmer you won't be _creating_
>> macros, but if your for-loops suddenly become as fast as Java's, that
>> will be a library-provided macro. Neither can the compiler solve it,
>> because it doesn't have the knowledge that the library has. Or you
>> think Scala has the performance issues it has simply because no one
>> tried to fix them?
>
> I don't understand... if a library has the knowledge how to make general
> for-loops as fast as Java, why can't the compiler be enhanced to use the
> same knowledge?
Does Scala, the compiler, know what Scalaz does? Specs? ScalaTest?
Dispatch? BlueEyes? Anti-xml? The code you'll be writing tomorrow?
It's not that the library knows enough about how to make a fast
for-loop, it is that it knows enough about ITSELF, and that knowledge
would let it perform optimizations the compiler can't or shouldn't. Or
even make such optimizations _available_, so that the *user*, who has
way more information about the problem, may choose to do them.
Let's give an example to make this more clear. Let's pick Range's
foreach. One problem with it is that it can't do this:
while (i < n) {
  ...
  i += inc
}
That may wrap around. However, in most cases "inc" is 1, for which
this idiom, this translation, is OK. How could the compiler possibly
know such details about Range's implementation and how it could be
optimized?
>> Talk to Yammer about that. If your "critical bit" means 80% of your
>> code, what use is it?
>
> So if macros were available to them, the Yammer programmers would have spent
> their time writing complicated optimization macros instead of working on
> their problem domain?
Why do you assume macros are complicated? Here's a CFor:
macro def cfor(_this, start, cond, incr)(body) = c"""
  ${start}
  while (${cond}) {
    ${body}
    ${incr}
  }"""

for(val i = 0, i < 10, i++) {
  println(i)
}
Is that complicated? Would it waste Yammer's time? What it would do is
make it possible for Yammer to keep the expressiveness of the
language while enjoying all the speed it could have, while at the same
time avoiding having to maintain complex compiler plugins.
Moreover, Yammer might not have to do that at all, because the
libraries themselves could take advantage of it, so that Yammer would
not have to bypass the libraries. See their comments: they resorted
to Java collections to avoid performance problems with Scala
collections.
> Writing code optimizers is a difficult task, especially on the JVM where you
> have to second guess what the JVM might or might not do with your code, and
> what shape of bytecode is most palatable to the JIT.
Yes, writing code optimizers is a difficult task. Writing optimized
code isn't, which is what macros let you do.
> Don't get me wrong, I see what you are saying here, I understand that macros
> can be used to optimize code specific to your problem, but is that really
> the main use case for macros?
No, it is not the main case. It's one of them. If that was *all*
macros offered, I don't think they'd have much of a chance of getting
in the language. On the other hand, offering the other stuff they do
*and* making it faster, that's a real trick.
> It boils down to saying "well, the compiler sucks and can't optimize away
> instance creation or other bottlenecks I'm running into, so I am going to
> use a complicated macro feature to work around compiler's deficiencies".
It's not that the compiler sucks. The JIT is one of the most impressive
pieces of technology out there; it has the benefit of knowing what
happens at run-time, and *it can't do anything about it either*!
Honestly, there isn't much more that Scala knows that JIT doesn't, so
it doesn't have much more opportunity to optimize than JIT. Some,
granted, but not all that much.
Optimizing turning-complete code is hard. It's NP-hard. Ad-hoc
optimization -- optimizing individual cases -- is quite possible, but
you have to special-case them. In fact, JIT does that a lot for Java
Tue, 2012-02-14, 21:51
#26
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 18:30, Ivan Todoroski wrote:
>
> Yet they completely ignore the existence of Scala Integrated Query[1] which
> is actually built on top of scala-virtualized and solves that particular
> problem in Squeryl, along with many more. So that comparison is misleading.
> I would love to see a more realistic comparison of SQL generation by macros
> vs SIQ.
If macros get used (they might), the AST processing will be done at
compile time. Otherwise, it will be done at run-time.
Tue, 2012-02-14, 22:01
#27
Re: Re: scala-macros vs scala-virtualized
Macros and virtualization have complementary strengths and uses:
- Macros are good for local, context-free rewrites on Scala ASTs that are possibly untyped and can be manipulated before type checking, and for generating boilerplate code at compile time which seamlessly integrates with the rest of the program (think type providers).
- Staging and virtualization are good for embedded DSLs that need modularity, sophisticated global analysis and compilation, computation at staging time, specialization to runtime data etc., in particular when the DSL expression trees do not exactly correspond to Scala trees.
The static/dynamic checking aspect is the other way round: virtualization maintains type safety and the relative evaluation order of expressions across the stage boundary, whereas macros allow free composition of untyped trees. Both are useful for different purposes.
Do the benefits of macros outweigh the cost of adding them to the language, in particular given the 'Scala is complex debate'? Nobody knows. The only way to find out is to go ahead and implement them ...
As with any new technology, there will certainly be many 'shiny new hammer, looking for nails' effects, so the challenge will be to provide guidance and empower users to pick the right tool for the job. Implementing an 'advanced' DSL exclusively using macros is almost certainly not a good idea, as experience in other languages shows. In the end I believe both technologies can benefit each other: macros can help reduce some of the boilerplate that is currently needed to define staged DSLs, and virtualization technology can help define more powerful macros, e.g. infix_ methods from Scala-Virtualized would allow redirecting arbitrary method calls to macros, within a local scope.
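As a rough illustration of that last point (the names below are
illustrative, not the actual Scala-Virtualized API): an infix_ method in
scope lets the virtualizing compiler reroute an ordinary-looking
operation on staged values into DSL-building code, and the same hook
could just as well hand the call to a macro.

trait StagedInts {
  // Rep[T] stands for a staged value of type T; Const and Plus are expression nodes.
  trait Rep[T]
  case class Const(value: Int) extends Rep[Int]
  case class Plus(lhs: Rep[Int], rhs: Rep[Int]) extends Rep[Int]

  // With virtualization, a call like a + b on Rep[Int] values can be redirected
  // here, so the DSL receives an expression tree instead of an immediate result.
  def infix_+(lhs: Rep[Int], rhs: Rep[Int]): Rep[Int] = Plus(lhs, rhs)
}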
Cheers,
- Tiark
On Feb 14, 2012, at 7:22 PM, sreque wrote:
> Looking at the last thread on the subject, I see you were in charge of
> making the benchmarking harness Daniel, so you probably know what
> you're talking about here. :-)
>
> Still, the final impression I got from the thread at
> http://groups.google.com/group/scala-internals/browse_thread/thread/1834...
> was that there were still some sticky issues left and your best bet
> was to stick with while loops if it really mattered.
>
> Also, I don't know the limitations of Scala's macros, but presumably
> with them I could easily write a construct like:
>
>
> cFor(var i = 0; i < n; i += 1) { body ... }
>
> that would compile down to:
>
> {
>   var i = 0
>   while (i < n) {
>     body ...
>     i += 1
>   }
> }
>
> Similarly, you could write for loop constructs that simulate Java's
> foreach statement on iterable objects to avoid the allocation of
> closures for the loop body. Of course, efficiently implementing a full-
> fledged loop with multiple assignments, breaks, and continues is a
> much bigger problem, but also hopefully doable. The point here is that
> with macros your performance is going to be much more predictable and
> less reliant on your JIT, the size of your method bodies, and the
> current phase of the moon.
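As an illustration of the kind of expansion being described (just a
sketch; xs and the println body are placeholders), the generated code
could be a plain iterator walk with the body inlined, so no closure
object is allocated for it:

val xs: Iterable[Int] = List(1, 2, 3)
val it = xs.iterator
while (it.hasNext) {
  val x = it.next()
  println(x) // loop body spliced in here instead of being passed as a Function1
}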
>
> Another example of a use case I had recently had to do with pattern
> matching on syntax. Basically, I need two versions of a function: one
> that runs really fast and one that prints out helpful error messages
> once all possible matches fail. Match failures are common and
> acceptable as long as at least one possible match exists, but the
> matching code runs in a tight enough loop that using something like
> closures to control whether or not helpful error messages are
> generated on match failures would still have a significant negative
> impact on performance. Macros make it possible to generate both the
> optimized version and the diagnostic version of a function from the
> same source, and the diagnostic version could then be run when the
> optimized version reports an error to find out the exact causes of the
> error.
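A stripped-down sketch of that pattern (a hypothetical token matcher;
with macros the two specializations would be generated from a single
definition rather than written twice):

// Hand-written stand-ins for what a macro would generate: the same match
// logic in a fast variant and a diagnostic variant.
def matchFast(s: String): Option[Int] = s match {
  case "zero" => Some(0)
  case "one"  => Some(1)
  case _      => None
}

def matchVerbose(s: String): Option[Int] = s match {
  case "zero" => Some(0)
  case "one"  => Some(1)
  case other  =>
    println("no rule matched token: " + other)
    None
}

// Run the cheap version in the hot loop; only when it reports a failure
// that matters, rerun the verbose version to explain why.
def matchWithDiagnostics(s: String): Option[Int] =
  matchFast(s) orElse matchVerbose(s)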
>
>
> On Feb 14, 12:00 pm, Daniel Sobral wrote:
>> On Tue, Feb 14, 2012 at 15:46, sreque wrote:
>>
>>> Basically any abstraction problem you have where the limitation is in
>>> the syntax of your language can be solved by macros. A simple example
>>> that lots of people complain about is Scala's lack of a good optimized
>>> for loop. Scala internal mailing list threads suggest that optimizing
>>> Scala's general for construct is incredibly difficult. With macros,
>>> this becomes a simple syntax problem with a simple solution.
>>
>> That isn't entirely true. Paul did manage to get Range's foreach to
>> the same speed as Java's (after warm up, mind you), and the complexity
>> associated with it can't be gotten rid of even with macros, except for cases
>> where the indices are literals or known constants.
>>
>> --
>> Daniel C. Sobral
>>
>> I travel to the future all the time.
Tue, 2012-02-14, 22:11
#28
Re: scala-macros vs scala-virtualized
Some perspective: First, scala-virtualized and Scala macros are both
experimental at the present stage. The discussion on this thread has
already worked out the main points of differences, so I won't go into
that.
Scala-virtualized is a research project, done at EPFL and Stanford. We
hope that some number of features will make it into main Scala.
Adriaan's pattern matcher looks like an excellent first candidate, and
other elements might follow. But there's no a priori intention to
migrate all elements of scala virtualized.
Macros have a shorter term horizon. There will be a SIP soon, and if
things go well we might see them soon in trunk, probably enabled by
some flag.
The intention of macros, and of Scala's language design in general, is
to simplify things. We have already managed to replace code lifting by
macros, and hopefully other features will follow. For instance, there
was a strong push to somehow eliminate the implicit parameter in
atomic { implicit transaction => ... }
and several other related situations. With macros this is trivial.
Without them, it requires a sophisticated dance with scoping rules.
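A minimal sketch of the situation (the names below are placeholders, not
an actual STM API): today the implicit has to be introduced by the user
at every call site, whereas a macro could introduce it inside the code
it generates.

class Transaction
class Account {
  // needs a transaction in scope
  def debit(amount: Int)(implicit tx: Transaction): Unit = ()
}

def atomic[A](body: Transaction => A): A = body(new Transaction)

val account = new Account
atomic { implicit tx => account.debit(10) } // the boilerplate in question

// A macro-based atomic could expand atomic { account.debit(10) } into a
// block that brings the implicit Transaction into scope itself.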
Optimizations such as on Range.foreach are another use case. I believe
that in the long run macros can be a simplifying factor. So, in my
mind, there is enough evidence to try out the idea. But the
implementation is considered experimental at present, and we do
welcome critical discussion around the SIP once it appears.
Cheers
Tue, 2012-02-14, 22:11
#29
Re: scala-macros vs scala-virtualized
A few points:
- Scala virtualized is not about code generation, necessarily. Scala-virtualized is about *virtualizing* concepts in the scala compiler so intermediate representations (ASTs) can be targeted at different platforms.
- *Lightweight modular staging* is the ability to "stage" scala code and adapt it to a different backend at (potentially) runtime. *THIS* is what I think you're confusing with scala-virtualized.
- Scala-virtualized and Macros do not conflict at all. Macros are used to generate ASTs. Macros *may* conflict with compiler plugins, since they allow you to do similar work as a compiler plugin but inside your scala code rather than as a compiler add-on. Scala Macros should work well with Scala-virtualized.
- Scala Macros are doing a bit of good to simplify the manipulation of the type-checker and the AST. Hopefully, if nothing else, scala macros will make the compiler and compiler plugins more approachable for everyone.
LMS and scala-virtualized have little to do with macros. Macros are front-end for generating Scala code. LMS is about targetting different backend with *THE SAME* scala code (which could have been generated by a macro).
On Tue, Feb 14, 2012 at 3:44 PM, Daniel Sobral <dcsobral@gmail.com> wrote:
On Tue, Feb 14, 2012 at 18:30, Ivan Todoroski <grnch_lists@gmx.net> wrote:
>
> Yet they completely ignore the existence of Scala Integrated Query[1] which
> is actually built on top of scala-virtualized and solves that particular
> problem in Squeryl, along with many more. So that comparison is misleading.
> I would love to see a more realistic comparison of SQL generation by macros
> vs SIQ.
If macros get used (they might), the AST processing will be done at
compile time. Otherwise, it will be done at run-time.
--
Daniel C. Sobral
I travel to the future all the time.
Tue, 2012-02-14, 22:21
#30
Re: scala-macros vs scala-virtualized
On Tuesday, 14. February 2012 at 19:09, Erik Osheim wrote:
That said, if the compiler API is stable, compiler plugins get more advanced, and the community is ready to embrace them, then I agree that maybe macros aren't needed.
An advantage of macros is that they feel much lighter than compiler plugins.
The barrier to include a new compiler plugin into Scala releases is higher than writing some library function as a macro def: for instance, I never heard of plans to write a compiler plugin to optimize Range.foreach and ship it with the Scala release. Also, enabling a plugin needs changes to the build configuration, but people can use a macro without even knowing it.
Tue, 2012-02-14, 22:31
#31
Re: scala-macros vs scala-virtualized
On Tue, Feb 14, 2012 at 10:00:33PM +0100, Lukas Rytz wrote:
> An advantage of macros is that they feel much lighter than compiler plugins.
>
> The barrier to include a new compiler plugin into Scala releases is higher than
> writing some library function as a macro def: for instance, I never heard of plans
> to write a compiler-plugin to optimize Range.foreach and ship it with the Scala
> release. Also, enabling a plugin needs changes to the build configuration, but
> people can use a macro without even knowing it.
I totally agree! I was just trying to be totally explicit (in the
context of Ivan's skepticism that macros are needed) about what I felt
like they added over compiler plugins, and why I feel like I need them.
Wed, 2012-02-15, 01:21
#32
Re: scala-macros vs scala-virtualized
It seems to me it's not entirely true that staging and macros don't overlap. For instance, you can achieve in template haskell (macros) what can be done in meta ocaml (staging). Similarity you can achieve with lisp/scheme macros what can be done with quote/unquote/eval.
But I agree that this discussion overlooks the fact that multi staged programming is just a special case of scala virtualized (if I understand correctly, the LMS paper is a special application of scala virtualized).
On Feb 15, 2012 4:02 AM, "Josh Suereth" <joshua.suereth@gmail.com> wrote:
A few points:
So basically, I'm not really sure what points you're trying to make here... Macros solve a need that *could* be solved by compiler plugins (see dick wall's subcut plugin for examples of where macros could have been used instead of a plugin).
- Scala virtualized is not about code generation, necessarily. Scala-virtualized is about *virtualizing* concepts in the scala compiler so intermediate representation (or ASTs) can be targeted at different platforms.
- *Lightweight modular staging* is the ability to "stage" scala code and adapt it to a different backend at (potentially) runtime. *THIS* is what I think you're confusing with scala-virtualized.
- Scala-virtualized and Macros do not conflict at all. Macros are used to generate ASTs. Macros *may* conflict with compiler plugins, since they allow you to do similar work as a compiler plugin but inside your scala code rather than as a compiler add-on. Scala Macros should work well with Scala-virtualized.
- Scala Macros are doing a bit of good to simplify the manipulation of the type-checker and the AST. Hopefully, if nothing else, scala macros will make the compiler and compiler plugins more approachable for everyone.
LMS and scala-virtualized have little to do with macros. Macros are front-end for generating Scala code. LMS is about targetting different backend with *THE SAME* scala code (which could have been generated by a macro).
On Tue, Feb 14, 2012 at 3:44 PM, Daniel Sobral <dcsobral@gmail.com> wrote:
On Tue, Feb 14, 2012 at 18:30, Ivan Todoroski <grnch_lists@gmx.net> wrote:
>
> Yet they completely ignore the existence of Scala Integrated Query[1] which
> is actually built on top of scala-virtualized and solves that particular
> problem in Squeryl, along with many more. So that comparison is misleading.
> I would love to see a more realistic comparison of SQL generation by macros
> vs SIQ.
If macros get used (they might), the AST processing will be done at
compile time. Otherwise, it will be done at run-time.
--
Daniel C. Sobral
I travel to the future all the time.
Wed, 2012-02-15, 04:01
#33
Re: scala-macros vs scala-virtualized
s/Similarity/Similarly/
Stupid phone.
Wed, 2012-02-15, 21:11
#34
Re: scala-macros vs scala-virtualized
Hi Martin,
Who do you see as the target audience for macros?
a) regular rank-and-file developers looking to reduce repeated boiler
plate and create simple DSLs specific to their projects
b) hardcore library designers who in the absence of macros would have
resorted to compiler plugins for their advanced tricks, but will now use
macros for that purpose
If it's A, what is your view on the problems raised in the first 3
points of the original email that started this thread, namely having to
learn an arcane AST sub-language to create macros and the difficulty of
debugging macros?
This would seem to raise the threshold on the amount of boilerplate and
duplication a regular developer would tolerate before finally sitting
down to learn the vagaries of macros to reduce it.
If it's B, then what should the busy regular developer use for reducing
boiler plate and creating simple DSLs for their work, if macros remain
generally difficult for them to use?
I do appreciate your response regarding plans for inclusion of
scala-macros vs scala-virtualized, it clarifies the direction of Scala
somewhat.
On 14.02.2012 21:56, martin odersky wrote:
> Some perspective: First, scala-virtualized and Scala macros are both
> experimental at the present stage. The discussion on this thread has
> already worked out the main points of differences, so I won't go into
> that.
>
> Scala-virtualized is a research project, done at EPFL and Stanford. We
> hope that some number of features will make it into main Scala.
> Adriaan's pattern matcher looks like an excellent first candidate, and
> other elements might follow. But there's no a priori intention to
> migrate all elements of scala virtualized.
>
> Macros have a shorter term horizon. There will be a SIP soon, and if
> things go well we might see them soon in trunk, probably enabled by
> some flag.
>
> The intention of macros, and of Scala's language design in general, is
> to simplify things. We have already managed to replace code lifting by
> macros, and hopefully other features will follow. For instance, there
> was a strong push to somehow eliminate the implicit parameter in
>
> atomic { implicit transaction => ... }
>
> and several other related situations. With macros this is trivial.
> Without them, it requires a sophisticated dance with scoping rules.
> Optimizations such as on Range.foreach are another use case. I believe
> that in the long run macros can be a simplifying factor. So, in my
> mind, there is enough evidence to try out the idea. But the
> implementation is considered experimental at present, and we do
> welcome critical discussion around the SIP once it appears.
>
> Cheers
>
Wed, 2012-02-15, 21:31
#35
Re: scala-macros vs scala-virtualized
On Wed, Feb 15, 2012 at 9:00 PM, Ivan Todoroski wrote:
> Hi Martin,
>
> Who do you see as the target audience for macros?
>
> a) regular rank-and-file developers looking to reduce repeated boiler plate
> and create simple DSLs specific to their projects
>
> b) hardcore library designers who in the absence of macros would have
> resorted to compiler plugins for their advanced tricks, but will now use
> macros for that purpose
>
>
> If it's A, what is your view on the problems raised in the first 3 points of
> the original email that started this thread, namely having to learn an
> arcane AST sub-language to create macros and the difficulty of debugging
> macros?
>
> This would seem to raise the threshold on the amount of boilerplate and
> duplication a regular developer would tolerate before finally sitting down
> to learn the vagaries of macros to reduce it.
>
>
> If it's B, then what should the busy regular developer use for reducing
> boiler plate and creating simple DSLs for their work, if macros remain
> generally difficult for them to use?
>
I think it's rather B. And, Scala already has a lot of mechanisms that
reduce boilerplate even without resorting to macros and scala
virtualized. So I am not at all concerned that the bar for using
either is high. In fact, I'd prefer it that way.
I see macros as a useful middle road. Every language designer is faced
with a constant stream of suggestions to enlarge the language. And a
lot of these make sense. But following them would make the language
bigger and that's something I am very reluctant to do. If at all
possible, I'd like to make it smaller! Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
It's a conundrum. Scala's philosophy is to make things possible, and
not erect walls against misuse. Trust the competency of your
programmers. But posts like this one make me realize the danger of
doing this:
http://yz.mit.edu/wp/true-scala-complexity/
It seems that everyone will at some point in their development as a
developer reach "peak abstraction",
http://lambda-the-ultimate.org/node/4442
and Scala's problem is that peak abstraction is so much higher (and
therefore more scary) than in other languages. The question I am
grappling with is, how can we get the advantages of macros without the
potential for misuse?
Cheers
Wed, 2012-02-15, 21:41
#36
Re: scala-macros vs scala-virtualized
On 14.02.2012 21:34, Daniel Sobral wrote:
> Why do you assume macros are complicated? Here's a CFor:
>
> macro def cfor(_this, start, cond, incr)(body) = c"""
> ${start}
> while(${cond}) {
> ${body}
> ${incr}
> }"""
Now that's interesting. At the time when I was researching scala-macros,
all the examples I could find were very similar to this one from the
Macros SIP:
class Queryable[T, Repr](query: Query) {
  macro def filter(p: T => Boolean)(implicit ctx: CompilerContext): Repr = {
    val ast = Block()
    val b = ctx.gensym(b, classOf[QueryableBuilder])
    ast += Assign(b, Call(newBuilder, this))
    ast += Call(query=, b, Create(classOf[Filter], Call(query, this), reify(p)))
    ast += Call(result, b)
    ast
  }
}
It even praises what a nice example of macros this is in the very next
sentence, which I barely read because my eyes were already bleeding. :)
Now I see in the Quasiquotations SIP this macro being rewritten as:
class Queryable[T, Repr](query: Query) {
  macro def filter(p: T => Boolean): Repr = c"""
    val b = $this.newBuilder
    b.query = Filter($this.query, $reify(p))
    b.result"""
}
This certainly looks much nicer, but I still don't really understand it
fully and the rest of the SIPs are not much help there.
Maybe it's just a matter of lack of documentation and communication
about what these shiny new things are and how to use them. Not so much
how they are implemented and how cool they are, but more about how a
busy developer who is not a Scala wizard can use them to solve practical
problems like refactoring, reducing boiler plate, and raising the
general level of abstractions.
On the other hand, I didn't have nearly as much trouble understanding
and following the scala-virtualized stuff. It felt easier and more
natural for someone who already understood existing Scala functionality
like implicits, pimping and operator overloading. It felt like a logical
extension of those concepts.
You can certainly dismiss me as not trying hard enough, but I am a Java
developer who really *really* likes the elegance of Scala and is trying
his best to learn about it in his very limited free time, and is even
already trying to use it in non-critical projects to see how it goes. I
think many developers are in a similar situation.
And for the record, I disagree 100% with the "Scala is too complex"
camp. I am not whining about complexity here, I am seriously trying to
learn how to use Scala to its fullest extent, but the cognitive load of
using advanced features is a very real factor in determining how much
boiler plate and code duplication one would tolerate before reaching for
an advanced Scala feature to eliminate it.
Wed, 2012-02-15, 21:51
#37
Re: scala-macros vs scala-virtualized
On 15.02.2012 21:25, martin odersky wrote:
> Scala already has a lot of mechanisms that
> reduce boilerplate even without resorting to macros and scala
> virtualized.
Oh absolutely! That is in fact the single biggest reason that drew me to
Scala. The level of abstraction achievable even with the current stable
version of Scala is astounding.
However there are still some warts here and there, such as the inability
to use things like == in DSLs and having to come up with ===, which I
hoped to be able to get around using something like scala-virtualized
that's relatively easy to use.
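A small sketch of that wart (the types are made up for illustration, not
any particular library): because == on Any always means standard
equality, a query DSL has to invent a separate operator to build
predicates.

case class Col(name: String)
case class Predicate(sql: String)

class ColOps(c: Col) {
  def ===(v: Any): Predicate = Predicate(c.name + " = " + v) // the usual workaround
}
implicit def colOps(c: Col): ColOps = new ColOps(c)

val age = Col("age")
val p = age === 30 // builds Predicate("age = 30"); age == 30 would just be false

Something that can intercept == within a DSL's scope, the way
scala-virtualized aims to, would let the natural spelling build the
predicate instead.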
Still, I appreciate the difficulty faced by language designers and the
various trade offs that have to be made.
Wed, 2012-02-15, 22:01
#38
Re: scala-macros vs scala-virtualized
On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin.odersky@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
Code generators are even more scary, though, but right now the options are sometimes
1. Have code that runs 30x slower than C++
2. Repeat yourself a gazillion times
3. Write a code generator
But picking (3) leaves one with generator code like
def opOverElement(meth: String, e: Entity, op: (String,String) => String) = {
if (isScalar && e.isScalar) None
else onlyIf (
meth!=":" && e.isScalar && r==GenericType && e.r==GenericType &&
((t^e.t)==e.t || (t==FloatType && (e.t==IntType || e.t==LongType)))
) {
"def "+meth+"("+e.paramary+") = " +
Entity(n, m, ^(e), %(e)).create((i,j) => {
Expr( op(\\(i,j,^(e)).text, e.\\(1,1,^(e),_+"0").text
}) , ^(e) ).mathed)
}
}
which doesn't exactly strike me as "not overboard", even though I wrote it to be as clear to me as possible. With code generators I find myself in a constant three-way battle between flexibility, clarity, and compactness, and usually lose on at least one if not two counts. (The above is an example of loss of clarity; the reason I sacrifice clarity for compactness is that when I need the generated code to do something nontrivial, it really helps if the generator is not spread out over several screens so I can actually see what's being produced in addition to all the machinery to produce it.)
Macros would provide a great fourth option even if they're not as general as an arbitrary code generator, since anything that can lower "repeat yourself a gazillion times" to a more moderate/manageable number will help avoid resorting to 1. or 3..
The question I am
grappling with is, how can we get the advantages of macros without the
potential for misuse?
Great examples that show off how to use it for clarity and performance (instead of wow-look-what-I-can-do) would be a start--sometimes one can fix these things culturally.
--Rex
Wed, 2012-02-15, 22:21
#39
Re: scala-macros vs scala-virtualized
Hi,
in my opinion the fear that people abuse macros would be substantially reduced if it ships with some real world libraries showing _when_ and _how_ to use them idiomatically.
The most unfortunate thing to do would be shipping it as a feature in search of a problem. This was imho the case with scala.Dynamic. Dynamic appeared with some interesting prototypes of "possible" things, but I haven't seen a single reasonable "real world" usage of it yet.
In the end I think it would be more reasonable to go ahead and release 2.10 without macros (or behind a compiler switch like it currently is) and officially ship macros when they are ready. (Ready == "there is a real library solving a real problem using it, which can be recommended to developers without having to worry")
The name "macro" is a 100% guarantee for a marketing nightmare, so it shouldn't be made worse by shipping it without any substantial example which can be explained to developers.
Thanks and bye,
Simon
Wed, 2012-02-15, 23:01
#40
Re: scala-macros vs scala-virtualized
On Wed, Feb 15, 2012 at 4:16 PM, Simon Ochsenreither <simon.ochsenreither@googlemail.com> wrote:
In the end I think it would be more reasonable to go ahead and release 2.10 without macros (or behind a compiler switch like it currently is) and officially ship macros when they are ready. (Ready == "there is a real library solving a real problem using it, which can be recommended to developers without having to worry")
cfor and avoiding code generators for TupleN/FunctionN are obvious choices.
I'd happily contribute a Muple library if anyone was interested. (Muples are mutable and specialized tuples, which I use in my code to speed certain types of fold by ~5x without losing expressive power or sacrificing safety *if* used properly. I'm pretty sure that with macros I could avoid exposing the mutable side of them at all and just have a high-performance fold that yields a tuple in the end...but I guess it will depend on the exact details of the implementation.)
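For a flavour of what that means (a sketch of the idea only, not Rex's
actual library): a mutable, specialized pair used as a fold accumulator
avoids allocating a Tuple2 per element and is converted to an ordinary
tuple only at the end.

class Muple2[@specialized(Int, Long, Double) A,
             @specialized(Int, Long, Double) B](var _1: A, var _2: B) {
  def toTuple: (A, B) = (_1, _2)
}

def sumAndCount(xs: Array[Double]): (Double, Int) = {
  val acc = new Muple2(0.0, 0) // one accumulator for the whole fold
  var i = 0
  while (i < xs.length) {
    acc._1 += xs(i)
    acc._2 += 1
    i += 1
  }
  acc.toTuple // the mutable side never escapes
}

The macro angle would be to keep acc entirely local to the generated
code, so users never see the mutable type at all.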
The name "macro" is a 100% guarantee for a marketing nightmare, so it shouldn't be made worse by shipping it without any substantial example which can be explained to developers.
One could call it something else, though I don't know that any of the alternatives are less scary marketing-wise.
--Rex
Wed, 2012-02-15, 23:11
#41
Re: scala-macros vs scala-virtualized
On Wed, Feb 15, 2012 at 10:51 PM, Rex Kerr <ichoran@gmail.com> wrote:
On Wed, Feb 15, 2012 at 4:16 PM, Simon Ochsenreither <simon.ochsenreither@googlemail.com> wrote:
In the end I think it would be more reasonable to go ahead and release 2.10 without macros (or behind a compiler switch like it currently is) and officially ship macros when they are ready. (Ready == "there is a real library solving a real problem using it, which can be recommended to developers without having to worry")
cfor and avoiding code generators for TupleN/FunctionN are obvious choices.
I'd happy contribute a Muple library if anyone was interested. (Muples are mutable and specialized tuples, which I use in my code to speed certain types of fold by ~5x without losing expressive power or sacrificing safety *if* used properly. I'm pretty sure that with macros I can not even expose the mutable side of them and just have a high-performance fold that yields a tuple in the end...but I guess it will depend on the exact details of the implementation.)
The name "macro" is a 100% guarantee for a marketing nightmare, so it shouldn't be made worse by shipping it without any substantial example which can be explained to developers.
One could call it something else, though I don't know that any of the alternatives are less scary marketing-wise.
Higher-order sourcecode?
--Rex
--
Viktor Klang
Akka Tech Lead
Typesafe - The software stack for applications that scale
Twitter: @viktorklang
Thu, 2012-02-16, 12:11
#42
Re: scala-macros vs scala-virtualized
On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:
On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin.odersky@epfl.ch> wrote:Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
Code generators are even more scary, though, but right now the options are sometimes
1. Have code that runs 30x slower than C++
2. Repeat yourself a gazillion times
3. Write a code generator
But picking (3) leaves one with generator code like
It's 2012. You can write code generators like this:
val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)
for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}
and emit Scala code that runs as fast as C [1]
- Tiark
[1] http://ppl.stanford.edu/papers/dsl11-rompf.pdf
def opOverElement(meth: String, e: Entity, op: (String,String) => String) = {
if (isScalar && e.isScalar) None
else onlyIf (
meth!=":" && e.isScalar && r==GenericType && e.r==GenericType &&
((t^e.t)==e.t || (t==FloatType && (e.t==IntType || e.t==LongType)))
) {
"def "+meth+"("+e.paramary+") = " +
Entity(n, m, ^(e), %(e)).create((i,j) => {
Expr( op(\\(i,j,^(e)).text, e.\\(1,1,^(e),_+"0").text
}) , ^(e) ).mathed)
}
}
which doesn't exactly strike me as "not overboard", even though I wrote it to be as clear to me as possible. With code generators I find myself in a constant three-way battle between flexibility, clarity, and compactness, and usually lose on at least one if not two counts. (The above is an example of loss of clarity; the reason I sacrifice clarity for compactness is that when I need the generated code to do something nontrivial, it really helps if the generator is not spread out over several screens so I can actually see what's being produced in addition to all the machinery to produce it.)
Macros would provide a great fourth option even if they're not as general as an arbitrary code generator, since anything that can lower "repeat yourself a gazillion times" to a more moderate/manageable number will help avoid resorting to 1. or 3..
The question I am
grappling with is, how can we get the advantages of macros without the
potential for misuse?
Great examples that show off how to use it for clarity and performance (instead of wow-look-what-I-can-do) would be a start--sometimes one can fix these things culturally.
--Rex
Thu, 2012-02-16, 16:51
#43
Re: scala-macros vs scala-virtualized
On Thu, Feb 16, 2012 at 6:04 AM, Tiark Rompf <tiark.rompf@epfl.ch> wrote:
On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin.odersky@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
Code generators are even more scary, though, but right now the options are sometimes
1. Have code that runs 30x slower than C++
2. Repeat yourself a gazillion times
3. Write a code generator
But picking (3) leaves one with generator code like
It's 2012. You can write code generators like this:
val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)
for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}
and emit Scala code that runs as fast as C [1]
- Tiark
That is very cool (impressive performance in your test case with OptiML, though the C++ doesn't look *quite* optimally tuned to me (e.g. it allocates memory every outer iteration instead of just once per thread in order to make the parallelization easier)), but doesn't abstract over number of arguments and such, which is almost invariably why I end up writing code generators.
For example, the opOverElement mess that I wrote generates element-wise binary operations distributed over matrices or vectors of fixed size while handling the type conversions in a high-performance but sensible way.
Anyway, I would certainly not turn down the performance advantages of virtualization if they were available in the standard Scala distribution, but that alone wouldn't solve the 1-3 problem (though it might make the 30x closer to 5x if it meant that I could specialize collections).
--Rex
Fri, 2012-02-17, 14:41
#44
Re: scala-macros vs scala-virtualized
I don't really see the issue with having 2 or more projects that tackle a similar problem or set of problems. More projects equals more options and that's a good thing in my opinion.
On 16 February 2012 17:43, Rex Kerr <ichoran@gmail.com> wrote:
On Thu, Feb 16, 2012 at 6:04 AM, Tiark Rompf <tiark.rompf@epfl.ch> wrote:
On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin.odersky@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
Code generators are even more scary, though, but right now the options are sometimes
1. Have code that runs 30x slower than C++
2. Repeat yourself a gazillion times
3. Write a code generator
But picking (3) leaves one with generator code like
It's 2012. You can write code generators like this:
val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)
for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}
and emit Scala code that runs as fast as C [1]
- Tiark
That is very cool (impressive performance in your test case with OptiML, though the C++ doesn't look *quite* optimally tuned to me (e.g. it allocates memory every outer iteration instead of just once per thread in order to make the parallelization easier)), but doesn't abstract over number of arguments and such, which is almost invariably why I end up writing code generators.
For example, the opOverElement mess that I wrote generates element-wise binary operations distributed over matrices or vectors of fixed size while handling the type conversions in a high-performance but sensible way.
Anyway, I would certainly not turn down the performance advantages of virtualization if they were available in the standard Scala distribution, but that alone wouldn't solve the 1-3 problem (though it might make the 30x closer to 5x if it meant that I could specialize collections).
--Rex
Fri, 2012-02-17, 15:31
#45
Re: scala-macros vs scala-virtualized
2012/2/17 Adam Jorgensen <adam.jorgensen.za@gmail.com>
I don't really see the issue with having 2 or more projects that tackle a similar problem or set of problems. More projects equals more options and that's a good thing in my opinion.
But most probably you don't want both features turned on at the same time when using the language... At least not before there is proof that they won't screw each other up.
On 16 February 2012 17:43, Rex Kerr <ichoran@gmail.com> wrote:
On Thu, Feb 16, 2012 at 6:04 AM, Tiark Rompf <tiark.rompf@epfl.ch> wrote:
On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin.odersky@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
Code generators are even more scary, though, but right now the options are sometimes
1. Have code that runs 30x slower than C++
2. Repeat yourself a gazillion times
3. Write a code generator
But picking (3) leaves one with generator code like
It's 2012. You can write code generators like this:
val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)
for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}
and emit Scala code that runs as fast as C [1]
- Tiark
That is very cool (impressive performance in your test case with OptiML, though the C++ doesn't look *quite* optimally tuned to me (e.g. it allocates memory every outer iteration instead of just once per thread in order to make the parallelization easier)), but doesn't abstract over number of arguments and such, which is almost invariably why I end up writing code generators.
For example, the opOverElement mess that I wrote generates element-wise binary operations distributed over matrices or vectors of fixed size while handling the type conversions in a high-performance but sensible way.
Anyway, I would certainly not turn down the performance advantages of virtualization if they were available in the standard Scala distribution, but that alone wouldn't solve the 1-3 problem (though it might make the 30x closer to 5x if it meant that I could specialize collections).
--Rex
Sat, 2012-02-18, 19:21
#46
Re: scala-macros vs scala-virtualized
I suppose. I would like the freedom to shoot myself in the foot if I so choose though :-)
Reminds me of the name of a C++ book I once saw: "C++: Enough rope to shoot yourself in the foot". Still makes me laugh :-)
On 17 February 2012 16:24, Alex Repain <alex.repain@gmail.com> wrote:
2012/2/17 Adam Jorgensen <adam.jorgensen.za@gmail.com>
I don't really see the issue with having 2 or more projects that tackle a similar problem or set of problems. More projects equals more options and that's a good thing in my opinion.
But most probably you don't want both features turned on at the same time when using the language... At least not before there is proof that they won't screw each other up.
On 16 February 2012 17:43, Rex Kerr <ichoran@gmail.com> wrote:
On Thu, Feb 16, 2012 at 6:04 AM, Tiark Rompf <tiark.rompf@epfl.ch> wrote:
On Feb 15, 2012, at 9:57 PM, Rex Kerr wrote:On Wed, Feb 15, 2012 at 3:25 PM, martin odersky <martin.odersky@epfl.ch> wrote:
Macros are a more stable and
systematic way to achieve more convenient syntax than compiler
plugins. That's why I think they are promising. The thing that scares
me deeply is that some people will inevitably go overboard with them
and misuse them.
Code generators are even more scary, though, but right now the options are sometimes
1. Have code that runs 30x slower than C++
2. Repeat yourself a gazillion times
3. Write a code generator
But picking (3) leaves one with generator code like
It's 2012. You can write code generators like this:
val distances = Stream[Double](data.numRows, data.numRows){ (i,j) => dist(data(i),data(j)) }
val densities = DenseVector[Int](data.numRows, true)
for (row <- distances.rows) {
  if (densities(row.index) == 0) {
    val neighbors = row find { _ < apprxWidth }
    densities(neighbors) = row count { _ < kernelWidth }
  }
}
and emit Scala code that runs as fast as C [1]
- Tiark
That is very cool (impressive performance in your test case with OptiML, though the C++ doesn't look *quite* optimally tuned to me (e.g. it allocates memory every outer iteration instead of just once per thread in order to make the parallelization easier)), but doesn't abstract over number of arguments and such, which is almost invariably why I end up writing code generators.
For example, the opOverElement mess that I wrote generates element-wise binary operations distributed over matrices or vectors of fixed size while handling the type conversions in a high-performance but sensible way.
Anyway, I would certainly not turn down the performance advantages of virtualization if they were available in the standard Scala distribution, but that alone wouldn't solve the 1-3 problem (though it might make the 30x closer to 5x if it meant that I could specialize collections).
--Rex
Sat, 2012-02-18, 22:21
#47
Re: scala-macros vs scala-virtualized
Am Mittwoch, 15. Februar 2012, 22:16:37 schrieb Simon Ochsenreither:
> The name "macro" is a 100% guarantee for a marketing nightmare, so it
> shouldn't be made worse by shipping it without any substantial example
> which can be explained to developers.
I can already imagine the headlines:
"after the longest suicide note in history scala has finally managed to fully erupt in insanity by adding #MACROS"
Well... to be honest - the first thing I thought when I heard "#MACROS" was "Oh dear - they won't do that - will they?". Now I know that we are talking about some substantially different things than those macros I was thinking about (flashback in the 80's... shudder).
But to me "macro" surely has a real negative bias. And I think I am not alone. So maybe its really worth thinking about names here. It won't be the first time in history where something is shipped with a label that's not as clear and technical as it could be (but most often for good reason - marketing is also a part of a product).
Greetings
Bernd
Sat, 2012-02-18, 23:11
#48
RE: scala-macros vs scala-virtualized
I think “macro” is fine, personally. Yes, there are some bad connotations associated with the word in some communities, but then there’s Lisp (Scheme, Clojure) and other languages that use the word to mean something quite different than what it means to C/C++ people. We would just be joining the ranks of those other languages. No problem.
Peter
From: scala-debate@googlegroups.com [mailto:scala-debate@googlegroups.com] On Behalf Of Bernd Johannes
Sent: Saturday, February 18, 2012 16:16
To: scala-debate@googlegroups.com
Cc: Simon Ochsenreither
Subject: Re: [scala-debate] scala-macros vs scala-virtualized
Am Mittwoch, 15. Februar 2012, 22:16:37 schrieb Simon Ochsenreither:
> The name "macro" is a 100% guarantee for a marketing nightmare, so it
> shouldn't be made worse by shipping it without any substantial example
> which can be explained to developers.
I can already imagine the headlines:
"after the longest suicide note in history scala has finally managed to fully erupt in insanity by adding #MACROS"
Well... to be honest - the first thing I thought when I heard "#MACROS" was "Oh dear - they won't do that - will they?". Now I know that we are talking about some substantially different things than those macros I was thinking about (flashback in the 80's... shudder).
But to me "macro" surely has a real negative bias. And I think I am not alone. So maybe its really worth thinking about names here. It won't be the first time in history where something is shipped with a label that's not as clear and technical as it could be (but most often for good reason - marketing is also a part of a product).
Greetings
Bernd
Sat, 2012-02-18, 23:21
#49
Re: scala-macros vs scala-virtualized
"source transformers"?
On Sat, Feb 18, 2012 at 11:09 PM, Chapin, Peter @ VTC <PChapin@vtc.vsc.edu> wrote:
--
Viktor Klang
Akka Tech Lead
Typesafe - The software stack for applications that scale
Twitter: @viktorklang
On Sat, Feb 18, 2012 at 11:09 PM, Chapin, Peter @ VTC <PChapin@vtc.vsc.edu> wrote:
I think “macro” is fine, personally. Yes there is some bad connotations associated with the word in some communities, but then there’s Lisp (Scheme, Clojure) and other languages that use the word to mean something quite different than what it means to C/C++ people. We would just be joining the ranks of those other languages. No problem.
Peter
From: scala-debate@googlegroups.com [mailto:scala-debate@googlegroups.com] On Behalf Of Bernd Johannes
Sent: Saturday, February 18, 2012 16:16
To: scala-debate@googlegroups.com
Cc: Simon Ochsenreither
Subject: Re: [scala-debate] scala-macros vs scala-virtualized
Am Mittwoch, 15. Februar 2012, 22:16:37 schrieb Simon Ochsenreither:
> The name "macro" is a 100% guarantee for a marketing nightmare, so it
> shouldn't be made worse by shipping it without any substantial example
> which can be explained to developers.
I can already imagine the headlines:
"after the longest suicide note in history scala has finally managed to fully erupt in insanity by adding #MACROS"
Well... to be honest - the first thing I thought when I heard "#MACROS" was "Oh dear - they won't do that - will they?". Now I know that we are talking about some substantially different things than those macros I was thinking about (flashback in the 80's... shudder).
But to me "macro" surely has a real negative bias. And I think I am not alone. So maybe its really worth thinking about names here. It won't be the first time in history where something is shipped with a label that's not as clear and technical as it could be (but most often for good reason - marketing is also a part of a product).
Greetings
Bernd
--
Viktor Klang
Akka Tech Lead
Typesafe - The software stack for applications that scale
Twitter: @viktorklang
Sun, 2012-02-19, 01:01
#50
Re: scala-macros vs scala-virtualized
Don't bend over trying to appease people who will judge you based on
their misconceptions. If they decide to misjudge it, THEY will call it
macros, no matter what name we choose.
2012/2/18 √iktor Ҡlang :
> "source transformers"?
>
>
> On Sat, Feb 18, 2012 at 11:09 PM, Chapin, Peter @ VTC
> wrote:
>>
>> I think “macro” is fine, personally. Yes there is some bad connotations
>> associated with the word in some communities, but then there’s Lisp (Scheme,
>> Clojure) and other languages that use the word to mean something quite
>> different than what it means to C/C++ people. We would just be joining the
>> ranks of those other languages. No problem.
>>
>>
>>
>> Peter
>>
>>
>>
>> From: scala-debate@googlegroups.com [mailto:scala-debate@googlegroups.com]
>> On Behalf Of Bernd Johannes
>> Sent: Saturday, February 18, 2012 16:16
>> To: scala-debate@googlegroups.com
>> Cc: Simon Ochsenreither
>> Subject: Re: [scala-debate] scala-macros vs scala-virtualized
>>
>>
>>
>> Am Mittwoch, 15. Februar 2012, 22:16:37 schrieb Simon Ochsenreither:
>>
>>
>>
>> > The name "macro" is a 100% guarantee for a marketing nightmare, so it
>>
>> > shouldn't be made worse by shipping it without any substantial example
>>
>> > which can be explained to developers.
>>
>>
>>
>> I can already imagine the headlines:
>>
>>
>>
>> "after the longest suicide note in history scala has finally managed to
>> fully erupt in insanity by adding #MACROS"
>>
>>
>>
>> Well... to be honest - the first thing I thought when I heard "#MACROS"
>> was "Oh dear - they won't do that - will they?". Now I know that we are
>> talking about some substantially different things than those macros I was
>> thinking about (flashback in the 80's... shudder).
>>
>>
>>
>> But to me "macro" surely has a real negative bias. And I think I am not
>> alone. So maybe its really worth thinking about names here. It won't be the
>> first time in history where something is shipped with a label that's not as
>> clear and technical as it could be (but most often for good reason -
>> marketing is also a part of a product).
>>
>>
>>
>> Greetings
>>
>> Bernd
>>
>>
>
>
>
>
> --
> Viktor Klang
>
> Akka Tech Lead
> Typesafe - The software stack for applications that scale
>
> Twitter: @viktorklang
>