Dynamic function generation, HOW?


Dynamic function generation, HOW?

Postby sankymoron » Sun, 08 Oct 2006 16:07:26 GMT

I am very fascinated by the fact that lisp can generate functions on the
fly.  But what fascinates me even more is: where is this chunk of code, in
the form of a new function, allocated? On the heap, or on the stack, ...?
There is something I am missing here.  Can someone please explain to me
how this works?
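
For concreteness, a minimal sketch (portable Common Lisp) of the kind of
thing meant by generating functions on the fly:

    ;; MAKE-ADDER builds and returns a brand-new function at run time;
    ;; each call produces a fresh closure object.
    (defun make-adder (n)
      (lambda (x) (+ x n)))

    (defvar *add-five* (make-adder 5))
    (funcall *add-five* 10)   ; => 15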


Re: Dynamic function generation, HOW?

Postby rpw3 » Sun, 08 Oct 2006 19:59:50 GMT


+---------------
| I am very fascinated by the fact that lisp can generate functions on the
| fly.  But what fascinates me even more is: where is this chunk of code, in
| the form of a new function, allocated? On the heap, or on the stack, ...?
+---------------

On the heap.

Or rather, in one of the heaps, depending on how your specific
implementation's GC works. That is, there might be a heap for
code that is different from the heap used for conses, say.
Or code might go into a (sub)heap that uses mark-and-sweep GC
instead of two-space copying, to avoid the need for code
relocation at GC time. That sort of thing.


-Rob

-----
Rob Warnock			< XXXX@XXXXX.COM >
627 26th Avenue			<URL: http://www.**--****.com/ ;
San Mateo, CA 94403		(650)572-2607


Re: Dynamic function generation, HOW?

Postby sankymoron » Mon, 09 Oct 2006 02:48:52 GMT

Thanks Rob.  That makes sense.  I am usually uncomfortable when
programming in lisp because I cannot tell what the code is going to
look like at run-time.  Any suggestions for a book that explains the
compilation/interpretation of lisp code and the lisp runtime?

sanket.







Re: Dynamic function generation, HOW?

Postby Barry Margolin » Mon, 09 Oct 2006 02:56:38 GMT

In article < XXXX@XXXXX.COM >,




Although you *can* generate code on the fly in Lisp, it's not a very 
common thing to do.

The Lisp runtime accomplishes it basically by having a built-in compiler 
and dynamic linker.
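
A hedged illustration of that built-in compiler at work; COMPILE is
standard Common Lisp, though what it emits (native code, bytecode, ...)
is up to the implementation:

    ;; Build a lambda expression as ordinary list data, hand it to the
    ;; compiler at run time, and call the resulting function object --
    ;; no external compiler or linker involved.
    (let* ((source '(lambda (x) (* x x)))
           (fn (compile nil source)))
      (funcall fn 7))    ; => 49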

-- 
Barry Margolin,  XXXX@XXXXX.COM 
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
*** PLEASE don't copy me on replies, I'll read them in the group ***

Re: Dynamic function generation, HOW?

Postby rpw3 » Mon, 09 Oct 2006 17:19:18 GMT


+---------------
| I am usually uncomfortable when programming in lisp because I
| cannot tell what the code is going to look like at run-time.
+---------------

Are you "uncomfortable when programming in C or C++ because you
cannot tell what the code is going to look like at run-time"?
If not, then why are you uncomfortable when programming in Lisp?

If so, then what do you do about it in C or C++? Answer: Probably
the same thing people do in Lisp, that is, write a small test case
and see what it compiles into:

    > (defun +/2-arg/fast (a b)
	(declare (fixnum a b)
		 (optimize (speed 3) (safety 0)))
	(the fixnum (+ a b)))

    +/2-ARG/FAST
    > (compile *)
    ; Compiling LAMBDA (A B): 
    ; Compiling Top-Level Form: 

    +/2-ARG/FAST
    NIL
    NIL
    > (disassemble *)
    4850A288:  .ENTRY +/2-ARG/FAST(a b) ; (FUNCTION (FIXNUM FIXNUM) FIXNUM)
	  A0:  POP   DWORD PTR [EBP-8]
	  A3:  LEA   ESP, [EBP-32]
	  A6:  ADD   EDX, EDI           ; No-arg-parsing entry point
	  A8:  MOV   ECX, [EBP-8]
	  AB:  MOV   EAX, [EBP-4]
	  AE:  ADD   ECX, 2
	  B1:  MOV   ESP, EBP
	  B3:  MOV   EBP, EAX
	  B5:  JMP   ECX
	  B7:  NOP
    > 


-Rob

-----
Rob Warnock			< XXXX@XXXXX.COM >
627 26th Avenue			<URL: http://www.**--****.com/ ;
San Mateo, CA 94403		(650)572-2607


Re: Dynamic function generation, HOW?

Postby sankymoron » Mon, 09 Oct 2006 19:17:25 GMT

When programming in C/C++, at least I have a conceptual idea about how
function calls, stacks, the heap, etc. work.

It's just that I would like to know how things like passing and
returning functions as data, returning multiple values, etc. happen at
the low level.
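
At the source level, the two features in question look like this (a
sketch in portable Common Lisp; what DISASSEMBLE shows for them varies
per implementation):

    ;; Passing a function as data: SORT receives the predicate #'< as
    ;; an ordinary argument.
    (sort (list 3 1 2) #'<)                   ; => (1 2 3)

    ;; Returning multiple values: FLOOR returns quotient and remainder,
    ;; and MULTIPLE-VALUE-BIND receives both without building a list.
    (multiple-value-bind (quotient remainder) (floor 17 5)
      (list quotient remainder))              ; => (3 2)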

But thanks for (disassemble *).  I didn't know about it.  Will
definitely use it from now on.

sanket.







Re: Dynamic function generation, HOW?

Postby Lars Rune Nøstdal » Mon, 09 Oct 2006 21:18:59 GMT




You might find L.i.S.P. (Queinnec's "Lisp in Small Pieces") interesting:
 http://www.**--****.com/ 





-- 
Lars Rune Nøstdal
 http://www.**--****.com/ 


Re: Dynamic function generation, HOW?

Postby Pascal Bourguignon » Mon, 09 Oct 2006 22:55:29 GMT

 XXXX@XXXXX.COM  writes:


How many programming languages do you know that provide their own
one-page-long interpreter?  What better conceptual idea can you get than
reading the source of EVAL?

Read SICP!
 http://www.**--****.com/ 
(cf Chapter 4)
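
For a sense of scale, here is a hedged toy sketch in Common Lisp (not
the SICP code, and far from a complete Lisp) of the shape such a
one-page evaluator takes:

    ;; Handles only QUOTE, IF, LAMBDA, variables, and application;
    ;; ENV is an alist of (symbol . value) bindings.
    (defun toy-eval (form env)
      (cond ((symbolp form) (cdr (assoc form env)))   ; variable lookup
            ((atom form) form)                        ; self-evaluating
            ((eq (first form) 'quote) (second form))
            ((eq (first form) 'if)
             (if (toy-eval (second form) env)
                 (toy-eval (third form) env)
                 (toy-eval (fourth form) env)))
            ((eq (first form) 'lambda)                ; closure = code + env
             (list 'closure form env))
            (t (toy-apply (toy-eval (first form) env)
                          (mapcar (lambda (a) (toy-eval a env))
                                  (rest form))))))

    (defun toy-apply (fn args)
      (if (and (consp fn) (eq (first fn) 'closure))
          (destructuring-bind (lambda-form saved-env) (rest fn)
            (toy-eval (third lambda-form)             ; single-form body
                      (append (mapcar #'cons (second lambda-form) args)
                              saved-env)))
          (apply fn args)))                           ; host functions, if any

    ;; (toy-eval '((lambda (x) (if x 'yes 'no)) 'hello) nil)  => YES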

-- 
__Pascal Bourguignon__                      http://www.**--****.com/ 

ADVISORY: There is an extremely small but nonzero chance that,
through a process known as "tunneling," this product may
spontaneously disappear from its present location and reappear at
any random place in the universe, including your neighbor's
domicile. The manufacturer will not be responsible for any damages
or inconveniences that may result.

Re: Dynamic function generation, HOW?

Postby sankymoron » Tue, 10 Oct 2006 06:12:21 GMT


Thanks! Thats exactly what I was looking for.

sanket.


Re: Dynamic function generation, HOW?

Postby rpw3 » Tue, 10 Oct 2006 12:22:03 GMT

< XXXX@XXXXX.COM > wrote:
+---------------
| When programming in C/C++, at least I have a conceptual
| idea about how function calls, stacks, the heap, etc. work.
+---------------

There's an old in-joke that goes: "C programmers know the cost
of everything and the value of nothing; Lisp programmers know
the value of everything and the cost of nothing."[1]

In the immortal words of Dennis Ritchie, "You are not expected
to understand this."[2] At least, not quite yet. ;-} ;-}
But if you keep going with Lisp, you will, eventually.

+---------------
| Its just that I would like to know how things like passing and
| returning functions as data, returning multiple values etc happen
| at the low level.
+---------------

As reasonable-sounding as that question may seem to you, the
reason you aren't going to get a simple answer to it is that
almost every Lisp system does it slightly (or *completely*!!)
differently, often (but not always) for completely valid reasons
having to do with tradeoffs the implementation has made between
performance, cross-platform portability, need (or not) for ease
of FFI with other languages, style of GC, and many, many other
design dimensions.

As others have advised, books such as Queinnec's "Lisp In Small
Pieces", or SICP, or Norvig's PAIP -- especially the chapters
in each on compiling and virtual machine design -- are well
worth the effort of studying if you're really serious about
the subject.

Object representations are one of the areas where various Lisp
implementations vary most widely. David Gudeman's 1993 (but still
very useful) survey of object representations & tagging is well
worth reading:

ftp://ftp.cs.indiana.edu/pub/scheme-repository/doc/pubs/typeinfo.ps.gz
"Representing Type Information in Dynamically Typed Languages"

But if you just want a rough mental performance model, you won't go
very wrong if you consider that each of these actions [at least, in
compiled code] has roughly the same (small) cost:

- A function call.
- A dynamic (special) binding.
- Allocating a heap object ("consing"). [Number of allocations
  is usually much more important than object size. Usually.]
- One iteration of a loop.
- The READ'ing of one token [if READ is used at run-time].

Of course, evaluating in-lined accessor functions [when the
compiler has been given enough information to do so] is cheaper
than full-calling a generic version, but you can/should ignore
that for now.

In fact, an even simpler approximation [still surprisingly
accurate] is:

- A function call has fixed cost.
- *Everything* is a function call. ;-}

And at this stage, you're probably better off using the built-in
TIME macro than DISASSEMBLE to figure out how long things take.
Probably the very first thing you're going to discover is that
in most implementations there's a *LOT* of difference between
running code interpreted and compiled:

> (defun delay (n)
    (dotimes (i n)))

DELAY
> (time (delay 1000000))
; Compiling LAMBDA NIL:
; Compiling Top-Level Form:

; Evaluation took:
; 3.97f0 seconds of real time
; 3.917419f0 seconds of user run time
; 0.00228f0 seconds of system run time
; 7,364,217,098 CPU cycles
; [Run times include 0.16f0 seconds GC run time]
; 0 page faults and
; 48,027,344 bytes consed.
;
NIL
>

Yikes! 7364 CPU clock cycles per iteration!
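
The natural next step (a hedged sketch, not part of the original post)
is to compile the same function and time it again; the actual numbers
depend entirely on the implementation and machine:

    ;; Compile DELAY and re-run the measurement.  In most implementations
    ;; the compiled loop runs far faster than the interpreted one and
    ;; allocates little or no memory.
    (compile 'delay)
    (time (delay 1000000))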

Re: Dynamic function generation, HOW?

Postby Luigi Panzeri » Tue, 10 Oct 2006 19:20:39 GMT

Pascal Bourguignon < XXXX@XXXXX.COM > writes:


You can also watch the video lectures:

 http://www.**--****.com/ 

Re: Dynamic function generation, HOW?

Postby tar » Wed, 11 Oct 2006 02:08:57 GMT

 XXXX@XXXXX.COM  writes:


I think this is the wrong approach.  What you need to do is understand
the conceptual model of what the lisp semantics imply.  In essence you
need to understand the virtual machine that lisp and its semantics
provide.  Understanding the implementation details is generally
unnecessary and often counter-productive, especially since the actual
implementation can vary depending on the lisp vendor and even the
underlying architecture.

For example, the way to understand cons cells and lists is at the level
of the box-and-pointer diagrams, not in terms of the layout of a cons
cell in memory.  The latter can be done in one of several ways, and
depending on OS and hardware, has not always been done the same way.
Sometimes an entire cons cell fits into a single machine word, sometimes
it takes two such words.  Sometimes the type information is encoded in
the lower order bits, other times not.  But knowing that doesn't help
you understand the important notions of structure sharing and other list
properties the way the box-and-pointer view does.
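
For instance, structure sharing can be observed entirely at the
box-and-pointer level (a small portable example):

    ;; Two lists sharing one tail: the cons cells for (2 3) exist once,
    ;; and both A and B point at them -- visible with EQ, with no need
    ;; to know how a cons cell is laid out in memory.
    (let* ((tail (list 2 3))
           (a (cons 1 tail))
           (b (cons 0 tail)))
      (eq (cdr a) (cdr b)))   ; => T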

So, to answer the original question:  It doesn't matter where lisp
allocates the function code's storage.  All you need to know is that the
code is sufficiently persistent that it can be used when in-scope and
accessible.  It may be on the heap, it may be on the stack (given
appropriate declarations), it may be in some other data structure.
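
The "given appropriate declarations" part refers to hints such as
DYNAMIC-EXTENT; a hedged sketch (the declaration is only advice, and an
implementation is free to ignore it and heap-allocate anyway):

    ;; ADD-OFFSET is a closure over OFFSET.  The DYNAMIC-EXTENT declaration
    ;; promises it will not outlive this call, so the compiler may allocate
    ;; it on the stack rather than the heap.
    (defun add-offset-to-all (numbers offset)
      (flet ((add-offset (x) (+ x offset)))
        (declare (dynamic-extent #'add-offset))
        (mapcar #'add-offset numbers)))

    ;; (add-offset-to-all '(1 2 3) 10)  => (11 12 13)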


-- 
Thomas A. Russ,  USC/Information Sciences Institute

Re: Dynamic function generation, HOW?

Postby sankymoron » Wed, 11 Oct 2006 05:40:42 GMT

Thank you all for the informative replies!

I will try to concentrate more on the conceptual model of lisp, and not
the low level details.

And I will definitely check out (time) and (disassemble) to understand
the performance of my code.

One question though:  I have heard that lisp code can be compiled into
a stand-alone x86 executable.  But how does the GC work then?  In C#
or Java, for example, code is compiled to the language of the
underlying runtime (bytecodes), and the runtime takes care of GC.  So
does a lisp stand-alone executable need any run-time, or does it have
GC code built in?


...off to becoming a lisp fan.

sanket.


Re: Dynamic function generation, HOW?

Postby Pascal Bourguignon » Wed, 11 Oct 2006 06:02:33 GMT

 XXXX@XXXXX.COM  writes:


You are still worrying about low level stuff.  It just happens!



Now, the short answer is that yes, there's a run-time library, just as
in C or any other programming language (except perhaps bare-bones
assembler).
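
As one concrete, implementation-specific illustration (SBCL; other
implementations have their own equivalents), a standalone executable is
made by dumping the running image, runtime and GC included:

    ;; SBCL-specific sketch.  The resulting "hello" binary bundles the
    ;; compiled code with the Lisp runtime, so the GC ships inside the
    ;; executable itself.
    (defun main ()
      (format t "Hello from a standalone Lisp executable!~%"))

    (sb-ext:save-lisp-and-die "hello" :toplevel #'main :executable t)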


A slightly longer answer would be that there are broadly two kinds of
garbage collectors: precise garbage collectors and conservative
garbage collectors.

Conservative garbage collectors scan memory, and any bit pattern that
looks like a pointer is treated as a pointer to some live memory
block.  These GCs don't need to know anything about the program or
the types of the data, but some memory might never be collected just
because some random bit pattern in memory happens to look like a
pointer to it.  See, for example, the Boehm GC.

Precise garbage collectors, on the other hand, need to know the types
of the data in order to locate the pointers precisely.  This can be
done statically or dynamically, in both cases with the help of the
compiler.  Dynamically, the type of the data is recorded along with
the data: we have type tags.
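
At the Lisp level, the visible effect of those tags is simply that
every object knows its own type at run time; how the tag bits are laid
out is the implementation-specific part:

    ;; Each of these values carries run-time type information that TYPE-OF
    ;; (and a precise GC) can consult; the exact answers vary by
    ;; implementation.
    (type-of 42)           ; => FIXNUM, or an INTEGER subrange
    (type-of (cons 1 2))   ; => CONS
    (type-of "hello")      ; => a string type such as (SIMPLE-ARRAY CHARACTER (5))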

For the long answer, read the sources of various implementations.


-- 
__Pascal Bourguignon__                      http://www.**--****.com/ 

CONSUMER NOTICE: Because of the "uncertainty principle," it is
impossible for the consumer to simultaneously know both the precise
location and velocity of this product.

Re: Dynamic function generation, HOW?

Postby Barry Margolin » Wed, 11 Oct 2006 12:07:37 GMT

In article < XXXX@XXXXX.COM >,




Of course it needs a run-time library.  Forget about GC: where do you 
think all the built-in functions like OPEN, MEMBER, etc. live?

The CONS function is also in the runtime library (just as malloc() is in 
the C runtime library), and GC is just part of the implementation of 
that function (as well as other object allocators like MAKE-ARRAY, 
MAKE-INSTANCE, etc.).
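
One cheap way to watch that allocator (and the GC behind it) at work is
TIME's consing report; a hedged sketch, with output format varying by
implementation:

    ;; Allocation is serviced by the Lisp runtime much as malloc() is by
    ;; the C library; TIME reports it as bytes consed, along with any GC
    ;; time the allocation triggered.
    (time (make-array 1000000 :element-type 'double-float
                              :initial-element 0.0d0))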

-- 
Barry Margolin,  XXXX@XXXXX.COM 
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
*** PLEASE don't copy me on replies, I'll read them in the group ***

