16

I don't understand how FP compilers make code that deals with immutable data structures fast, avoid blowing up the stack, etc.

For example, an insert operation on a tree has to copy the whole tree before adding the new node and return the copy, whereas the imperative counterpart only needs to attach a pointer to the new node. If the insert operation runs millions of times, it would consume a lot of memory, and copying would get slower and slower as the tree grows. How do FP compilers actually optimize this?

4 Answers

16

You don't have to copy the whole tree to make a change; you can share most of the structure. See e.g. the diagrams in this blog, or this talk by Rich Hickey on Clojure (see discussion of hash tries about halfway through).
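To make the sharing concrete, here is a sketch (my own illustration, not taken from the linked material) of a persistent binary search tree insert in Haskell. Only the nodes on the path from the root to the insertion point are rebuilt, so an insert into a balanced tree of n nodes allocates O(log n) new nodes and shares every untouched subtree with the old tree:

```haskell
-- A persistent (immutable) binary search tree.
data Tree a = Leaf | Node (Tree a) a (Tree a)

-- Insert rebuilds only the path it walks; siblings are shared.
insert :: Ord a => a -> Tree a -> Tree a
insert x Leaf = Node Leaf x Leaf
insert x t@(Node l v r)
  | x < v     = Node (insert x l) v r  -- r is shared, not copied
  | x > v     = Node l v (insert x r)  -- l is shared, not copied
  | otherwise = t                      -- already present: share whole tree

-- In-order traversal, for inspecting the result.
toList :: Tree a -> [a]
toList Leaf = []
toList (Node l v r) = toList l ++ [v] ++ toList r
```

The old tree is still fully usable after the insert; both versions coexist and share structure, which is what makes the "copy the whole tree" cost in the question not actually arise.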


1 Comment

The diagram blog link is dead; if you can find another link, please update your answer. Thanks!
7

The compiler won't really optimize this; it is something you have to design for explicitly when coding. Techniques for doing this are explained in the excellent Purely Functional Data Structures (book, thesis).
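As one taste of the kind of technique that literature covers, here is a sketch (my own illustration, with my own names) of the classic two-list "batched" FIFO queue: enqueue is O(1), and dequeue is amortized O(1) because each element is moved through `reverse` at most once.

```haskell
-- A purely functional FIFO queue: front list holds elements to
-- dequeue, back list holds enqueued elements in reverse order.
data Queue a = Queue [a] [a]

empty :: Queue a
empty = Queue [] []

push :: a -> Queue a -> Queue a
push x (Queue front back) = check (Queue front (x : back))

pop :: Queue a -> Maybe (a, Queue a)
pop (Queue [] _)       = Nothing
pop (Queue (x:f) back) = Just (x, check (Queue f back))

-- Invariant maintained by check: if the front list is empty,
-- the whole queue is empty.
check :: Queue a -> Queue a
check (Queue [] back) = Queue (reverse back) []
check q               = q
```

Each `push` and `pop` returns a new queue value while the old one remains valid, which is the persistent-by-default behavior the question is asking about.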

Comments

1

Take a look at the Zipper data structure.
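For illustration, here is a minimal list zipper sketch in Haskell (names and details are my own, not from the answer): a cursor into a sequence that supports O(1) edits at the focus, rebuilding only the local context instead of the whole structure.

```haskell
-- A zipper over a list: the elements to the left of the cursor
-- (stored reversed, so the nearest is first) and the elements at
-- and to the right of the cursor.
data Zipper a = Zipper [a] [a]

fromList :: [a] -> Zipper a
fromList xs = Zipper [] xs

-- Move the cursor one step right; no-op at the end.
right :: Zipper a -> Zipper a
right (Zipper ls (x:rs)) = Zipper (x : ls) rs
right z                  = z

-- Edit the focused element in O(1); no-op past the end.
modify :: (a -> a) -> Zipper a -> Zipper a
modify f (Zipper ls (x:rs)) = Zipper ls (f x : rs)
modify _ z                  = z

-- Rebuild the plain list from the zipper.
toList :: Zipper a -> [a]
toList (Zipper ls rs) = reverse ls ++ rs
</imports-placeholder>
```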

Comments

-5

If the garbage collector is doing its job, the older copies of the data structure will be reclaimed when they're no longer being used.

2 Comments

May I ask why this answer was down-voted? Suppose I want to thread the same set through a series of modifications and keep things functional; isn't it helpful to know that the GC will collect the parts that are no longer in use?
It doesn't answer the question in any way.
