That's something I've thought long about before...
It's difficult to draw a distinction between a simple life form (or even a more complex one, I guess), which is essentially a series of chemical reactions and electrical impulses — an organic computer, if you want to look at it that way — and a man-made computer, which is constructed to turn electrical impulses into meaningful information. At what point does the computer become a thinking 'being' that is self-aware by its own merits, not simply because it is programmed to act that way — as we all, in a sense, are? This has obviously perplexed many people, judging by the popularity of the 'robot AI taking over the earth' sci-fi scenario.
Also, I don't think the tree is killed; rather, the stump survives and grows a new 'top'... though I may be wrong.