Imagine you are a data scientist at Amazon tasked with developing a neural network to categorize images of various chair types, such as "Office Chair" and "Dining Chair." Which activation function would you use in the hidden layers of your network: ReLU or tanh? Discuss the advantages of your chosen function over the other.
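
To make the comparison concrete, here is a minimal sketch in PyTorch of a small classifier whose hidden-layer activation is swappable, so the two choices can be tried side by side. The layer sizes, 64x64 RGB input, and the helper name `make_chair_classifier` are illustrative assumptions, not part of the question:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a small image classifier whose hidden-layer
# activation is configurable, so ReLU and tanh can be compared directly.
# The layer sizes and the two-class setup ("Office Chair" vs. "Dining
# Chair") are illustrative assumptions, not a prescribed architecture.
def make_chair_classifier(activation: nn.Module) -> nn.Sequential:
    return nn.Sequential(
        nn.Flatten(),                 # e.g. 3x64x64 image -> 12288 features
        nn.Linear(3 * 64 * 64, 256),
        activation,                   # hidden activation under comparison
        nn.Linear(256, 64),
        activation,                   # ReLU and Tanh are stateless, so the
                                      # same module instance can be reused
        nn.Linear(64, 2),             # two chair classes
    )

relu_model = make_chair_classifier(nn.ReLU())
tanh_model = make_chair_classifier(nn.Tanh())

x = torch.randn(4, 3, 64, 64)         # dummy batch of 4 RGB images
print(relu_model(x).shape)            # torch.Size([4, 2])
print(tanh_model(x).shape)            # torch.Size([4, 2])
```

A typical line of argument: ReLU does not saturate for positive inputs, so gradients flow through deep stacks without the vanishing-gradient problem that tanh's saturation at large |x| can cause, and it is cheaper to compute; tanh's zero-centered output can still help in shallow networks.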