functional programming vs pythonic list comprehension | Sololearn: Learn to code for FREE!

+15

# functional programming vs pythonic list comprehension

I see filter and map covered by list comprehensions: `[f(i) for i in list]` and `[i for i in list if f(i)]`. But what about `reduce`? Is there a pythonic way?

5/2/2020 9:02:19 AM

Oma Falk

+10

Tibor Santa don't even think about deleting!

+8

Very cool :D There's something interesting to be said about reducing into a list.

As we know, reduce takes a binary operation that turns two values into one and applies it over and over until the list is reduced to a single value. Reducing by `a+b` obviously takes a list of numbers and sums it. But after we reduce, we no longer know which numbers went into the final sum, and in that sense we lose information! Reducing by appending (`a.append(b)`) just chucks all the numbers into a list, so all the information about which elements came in is preserved, which makes reducing into a list special :)

Because of this we even give lists a special name: the "free monoid". A monoid is just a type plus a binary operation like the ones above. (int, a+b) is a monoid, so is (int, a*b), and (string, a+b), and the free monoid (List<Foo>, a.concat(b)).
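In Python terms those monoids look roughly like this (a sketch; note that Python's `list.append` returns `None`, so the list case uses `+` concatenation instead):

```python
from functools import reduce

nums = [1, 2, 3, 4]

# (int, +): summing loses which numbers went in.
assert reduce(lambda a, b: a + b, nums, 0) == 10

# (int, *): another monoid on the same type.
assert reduce(lambda a, b: a * b, nums, 1) == 24

# (string, +): concatenation.
assert reduce(lambda a, b: a + b, ["foo", "bar"], "") == "foobar"

# The free monoid: concatenating singleton lists preserves everything.
assert reduce(lambda a, b: a + b, [[1], [2], [3]], []) == [1, 2, 3]
```

The third argument to reduce is the monoid's identity element (0 for +, 1 for *, and so on).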

+7

Schindlabua I see you like functional programming. Do you work with it professionally? Is anyone using it professionally, at large scale? Besides the "pureness" of leaving no temp variables behind, is there any other advantage? In Python the classic for/while/if-else style usually wins timing contests against FP, and in my view the code is much harder to read and maintain. Am I wrong?

+6

Or zip, or lambda, or... I don't think functional programming can be completely replaced by list comprehension. And that's coming from someone who doesn't even like functional programming :P

+6

Schindlabua actually you can make reduce return a list too, although it is not the typical usage :) https://code.sololearn.com/cgU9nid0Cf1a/?ref=app But it is very different from a list comprehension either way.
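The linked code isn't reproduced here, but one sketch of making reduce return a list (not necessarily what the linked snippet does) could be:

```python
from functools import reduce

xs = [1, 2, 3]

# reduce rebuilds the list one element at a time, starting from []...
doubled = reduce(lambda acc, x: acc + [x * 2], xs, [])

# ...matching what the comprehension would produce.
assert doubled == [x * 2 for x in xs]
```

It works, but it copies the accumulator on every step, so it is both slower and less readable than the comprehension.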

+6

Oma Falk I could think of something like this, to use list comprehension to emulate a reduction. https://code.sololearn.com/cOxfdsJ2fEf4/?ref=app But it is really ugly and unpythonic, I don't even want to make it public. My hand itches for the Delete button. Maybe I can figure out something more functional later :)
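The linked snippet isn't shown here, but one equally unpythonic way to fake a reduction inside a comprehension is to mutate state with the walrus operator (Python 3.8+), as a sketch:

```python
from functools import reduce

xs = [1, 2, 3, 4]

total = 0
# Abusing a comprehension for its side effects: each step rebinds
# `total`, so the last element holds the final reduction.
steps = [total := total + x for x in xs]

assert steps == [1, 3, 6, 10]
assert steps[-1] == reduce(lambda a, b: a + b, xs)
```

As a bonus it yields all the partial sums, but the hidden mutation is exactly what comprehensions are supposed to avoid.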

+6

Edward nono...very interesting aspects

+5

Schindlabua right, but Guido said he doesn't like map and filter, since list comprehensions can do the same and are more pythonic. Actually my question was based on this. I will edit.
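For reference, the equivalence that argument rests on, for finite sequences:

```python
xs = [1, 2, 3, 4, 5]

def square(x):
    return x * x

# map and its comprehension form agree...
assert list(map(square, xs)) == [square(x) for x in xs]

# ...and so do filter and its comprehension form.
assert list(filter(lambda x: x % 2, xs)) == [x for x in xs if x % 2]
```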

+5

Schindlabua linear algebra for runaways? Booahh ...cool

+5

This is a tiny bit better, as I store the intermediate state in an instance variable instead of a global... https://code.sololearn.com/cY7wQuSpbx45/?ref=app
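The linked code isn't reproduced here; a rough sketch of the idea (the `Accumulator` class is a made-up stand-in, not the actual snippet) might be:

```python
class Accumulator:
    """Keeps the running state on the instance instead of in a global."""

    def __init__(self, start=0):
        self.state = start

    def step(self, x):
        self.state += x
        return self.state

acc = Accumulator()
partial_sums = [acc.step(x) for x in [1, 2, 3, 4]]

assert partial_sums == [1, 3, 6, 10]
assert acc.state == 10  # the final reduction lives on the instance
```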

+5

Edward Just to illustrate how big of a deal pureness is: it means that pure functional code can be trivially run on multiple cores or machines, because we know for sure there is no shared state and nothing that can cause deadlocks. (Hence Erlang for networking; its code is massively parallel.)

Pureness also implies that we can derive certain facts about functions just by looking at the type. Take a function `foo` that takes an arbitrary type `t` and returns another `t`. In Haskell syntax:

foo :: t -> t

Just from the fact that the type is generic, plus the fact that functions are pure, we *know* that the only possible implementation of foo is:

foo x = x

(That is, the function does nothing and returns what we plugged in.) And in theory the compiler can know this too, and produce more optimized code than a C compiler ever could, for example by optimizing this function away entirely.

But I hate co-opting threads and going off-topic, so y'all can drop me a DM if you want to know more about anything :D
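A small Python-flavoured sketch of the parallelism point: since a pure function shares no state, handing it to a pool of workers cannot change the answer.

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    # Pure: the result depends only on the argument, with no shared
    # state, so the calls may be evaluated in any order by any worker
    # (a process pool or a remote machine would do just as well).
    return x * x

xs = list(range(10))

with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(square, xs))

assert parallel == [square(x) for x in xs]
```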

+5

Schindlabua Edward Please do continue the thread, I find it very educational. Oma Falk agrees

+5

The most common use for reduce is to calculate the hash code of an object that has some iterable attribute, i.e.:

    from functools import reduce
    from operator import xor

    def __hash__(self):
        hashes = map(hash, self._list)  # 1
        return reduce(xor, hashes, 0)   # 2

#1: a lazy iterator via map, to compute the hash of each element of the list or iterable. #2: operator.xor could be a lambda function, but this is more readable.

Reduce is also used (wrongly) for computing the sum of all elements of a list, when we have the sum() built-in function that does exactly that but is much more readable and simple.

Answering your question: since map and filter return lazy iterators, the common pythonic way to substitute them is a generator expression if you want to save memory and work lazily, or a list comprehension if you want a list directly. (Reduce is different: it collapses the iterable to a single value.) Anyway, I would recommend comprehensions for their readability; this is what Python has aimed for since its beginning: powerful, but simple and readable at the same time.
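As a runnable version of that pattern (the `Bag` class is a made-up example, not from the post):

```python
from functools import reduce
from operator import xor

class Bag:
    """Hypothetical container with an iterable attribute."""

    def __init__(self, items):
        self._list = tuple(items)

    def __hash__(self):
        hashes = map(hash, self._list)  # lazy per-element hashes
        return reduce(xor, hashes, 0)   # fold them together with XOR

a = Bag([1, 2, 3])
b = Bag([1, 2, 3])
assert hash(a) == hash(b)  # equal contents give equal hashes
```

XOR is a natural fold here because it is associative, commutative, and has 0 as its identity, so the result doesn't depend on grouping.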

+5

Also, the current map and filter in Python 3 are much superior to a list comprehension in one respect: they are lazy, so you can use them with infinite sequences. Of course you can also write a generator expression instead of a list comprehension, just by changing the brackets to parens, and get the same effect. Reduce in Python is not lazy, which is a huge drawback.

I also think there is an aesthetic and readability benefit to map and filter. For example, compare:

filter(bool, iterable)

(n for n in iterable if n)

I find the first form much easier to read, even though the effect is exactly the same.

I also disagree with Guido's argument about associativity. I dipped my hand into Haskell a bit too, and there we have the foldr and foldl functions, which are similar to reduce; the difference between the two is the direction in which they process the list, starting from the left or the right. Associativity of the function (operator) plays a huge role there. Python's reduce is really simplified compared to that.
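For example, with `itertools` the lazy versions compose over an infinite sequence:

```python
from itertools import count, islice

# map and filter are lazy, so they compose with the infinite
# sequence count(); a list comprehension here would never finish.
even_squares = filter(lambda n: n % 2 == 0, map(lambda n: n * n, count()))

first_five = list(islice(even_squares, 5))
assert first_five == [0, 4, 16, 36, 64]
```

The equivalent generator expression `(n * n for n in count() if (n * n) % 2 == 0)` is just as lazy; only the square-bracket list comprehension would try to materialize everything.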

+4

Functional programming is more than just map/reduce, just saying :P List comprehensions can't implement reduce in general, because reduce takes a list and returns a non-list, while a list comprehension always gives you a list.
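A minimal illustration of that difference:

```python
from functools import reduce
from operator import add

xs = [1, 2, 3]

# A comprehension always maps a list to another list...
assert [x + 1 for x in xs] == [2, 3, 4]

# ...while reduce collapses a list into a single non-list value.
assert reduce(add, xs) == 6
```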

+4