One of the challenges in working with 3D lighting in After Effects is that the intensity of a light doesn't diminish as the distance from the light increases. In the physical world, light intensity falls off as the square of the distance from the light. What we'd like to do is come up with a convincing simulation of this natural phenomenon. It should have variables that let us specify how fast the light falls off with distance, and a threshold distance within which there is no falloff at all.
When you convert a layer to 3D, it picks up a bunch of new properties having to do with how the layer responds to the lighting in the scene. We'll create our simulation by applying the same expression to two of these properties, "Diffuse" and "Specular". For simplicity, we'll assume that there is only one light in the scene. The expression first calculates how far our layer is from the light. If that distance is within the "no falloff" threshold, we leave the property's value unchanged. If it's beyond the threshold, we diminish the property's value exponentially as the distance increases, using Math.exp() to do the calculation.
decay = .005;     // how quickly the intensity falls off with distance
noFalloff = 200;  // distance (in pixels) within which there is no falloff
L = thisComp.layer("Light 1");
d = length(L.transform.position, transform.position);
if (d < noFalloff){
  value;                                   // within the threshold: leave the value unchanged
}else{
  value/Math.exp((d - noFalloff)*decay);   // beyond it: exponential falloff
}
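To see what the falloff math does on its own, here is a sketch of the same calculation in plain JavaScript, outside After Effects. The function names `falloffFactor` and `dist` are ours, and `dist` stands in for After Effects' built-in `length(point1, point2)`, which isn't available in ordinary JavaScript:

```javascript
// Returns the multiplier applied to a material property for a layer
// at distance d from the light, given the two tuning variables.
function falloffFactor(d, noFalloff, decay) {
  // Within the threshold the value is unchanged (factor of 1).
  if (d < noFalloff) return 1;
  // Beyond it, the value decays exponentially with the excess distance.
  return 1 / Math.exp((d - noFalloff) * decay);
}

// Distance between two 3D positions, standing in for AE's length(p1, p2).
function dist(p1, p2) {
  return Math.hypot(p1[0] - p2[0], p1[1] - p2[1], p1[2] - p2[2]);
}
```

With decay = .005 and noFalloff = 200, a layer 200 pixels from the light keeps its full value (factor 1), while a layer 400 pixels away is dimmed to 1/e, roughly 37% of its original value; raising decay makes that dimming happen over a shorter distance.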