All of the other contributors gave great answers that work when you have a single-level (flat) list, but of the methods mentioned so far, only `copy.deepcopy()` clones a list without leaving it pointing at the nested `list` objects when you are working with multidimensional, nested lists (lists of lists). While Felix Kling refers to it in his answer, there is a little more to the issue, and possibly a workaround using built-ins that might prove a faster alternative to `deepcopy`.
While `new_list = old_list[:]`, `copy.copy(old_list)` and, for Py3k, `old_list.copy()` work for single-level lists, they still point at the `list` objects nested within `old_list`, so those inner objects are shared between `old_list` and `new_list`, and changes to one of the nested `list` objects are visible in the other.
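To make that sharing concrete, here is a minimal sketch (the variable names are just illustrative):

```
# A shallow copy duplicates only the outer list; the inner lists are shared.
old_list = [[1, 2], [3, 4]]
new_list = old_list[:]              # copy.copy(old_list) or old_list.copy() behave the same way

new_list[0][0] = 99                 # mutate an inner list through the copy
print(old_list)                     # [[99, 2], [3, 4]] -- the change shows up in both
print(old_list[0] is new_list[0])   # True: both outer lists hold the same inner list object
```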
Edit: New information brought to light

As was pointed out by both Aaron Hall and PM 2Ring, using `eval()` is not only a bad idea, it is also much slower than `copy.deepcopy()`.

This means that for multidimensional lists, the only option is `copy.deepcopy()`. With that being said, it really isn't an option, as the performance goes way south when you try to use it on a moderately sized multidimensional array. I tried to `timeit` it using a 42x42 array, which is not unheard of or even that large for bioinformatics applications, and I gave up on waiting for a result and just started typing my edit to this post.

It would seem that the only real option then is to initialize multiple lists and work on them independently. If anyone has any other suggestions for how to handle multidimensional list copying, it would be appreciated.
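For reference, here is a minimal sketch of what `copy.deepcopy()` does get right for nested lists, the performance caveat below notwithstanding (the names are again just illustrative):

```
import copy

old_list = [[1, 2], [3, 4]]
new_list = copy.deepcopy(old_list)  # recursively copies the inner lists as well

new_list[0][0] = 99                 # mutate an inner list through the deep copy
print(old_list)                     # [[1, 2], [3, 4]] -- the original is untouched
print(old_list[0] is new_list[0])   # False: the inner lists are independent objects
```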
As others have stated, there are significant performance issues in using the `copy` module and `copy.deepcopy` for multidimensional lists.
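If you want to quantify that cost on your own machine, a rough `timeit` sketch along these lines could be used; the 42x42 grid and the repetition count are arbitrary choices here, and no timings are claimed. The plain slice is included only to show the relative overhead, since, as explained above, it does not actually copy the inner lists:

```
import copy
import timeit

grid = [[0] * 42 for _ in range(42)]   # a 42x42 nested list, as in the example above

deep = timeit.timeit(lambda: copy.deepcopy(grid), number=1000)
shallow = timeit.timeit(lambda: grid[:], number=1000)
print(f"deepcopy: {deep:.4f}s   shallow slice: {shallow:.4f}s   (1000 copies each)")
```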