This is my first post on Math Stack Exchange, so feel free to eviscerate me if this question has already been asked. I'm just not sure how to describe it well enough to find it elsewhere... or perhaps it hasn't been asked.
I have a line y = x that represents the possible means of two random variables M1 and M2. For a constant z, what are the two equations that describe the boundaries around y = x at which the log2 fold change between M1 and M2 equals z?
I'm not 100% sure this question is phrased properly, so I've drawn a couple of pictures to help illustrate what I'm trying to ask. Basically, I have two measurements without replicates, and I'm building a simple tool to investigate them. The user sets a cutoff threshold (say 1.2, for example), and a flag is raised for any sample whose difference is greater than abs(log2(1.2)).
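To make the flagging rule concrete, here is a minimal sketch of what I mean, assuming the "difference" is the absolute log2 ratio of the two measurements (the names m1, m2, and threshold are just mine for illustration):

```python
import math

def is_flagged(m1, m2, threshold=1.2):
    """Flag a sample when |log2(m1/m2)| exceeds |log2(threshold)|."""
    return abs(math.log2(m1 / m2)) > abs(math.log2(threshold))

# ratio 13/10 = 1.3 exceeds the 1.2 cutoff, ratio 11/10 = 1.1 does not
print(is_flagged(10.0, 13.0))  # True
print(is_flagged(10.0, 11.0))  # False
```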
This diagram shows what I mean. I'm looking to calculate 'a' and 'b' based on 'z', which in this diagram equals 1 (and therefore represents a log2 fold change of 1 between 'a' and 'b'): Diagram of the measurements I am seeking
This is time-series genetic data, so when I plot the results, each gene gets a plot with two lines. What I've done is calculate the mean of the two samples, and I then want to plot the range (as error bars) around that mean which represents the abs(log2(1.2)) range. If the gene is flagged, the points will sit outside the error bars.
I've tried to solve this with some linear algebra, but the results I'm getting don't make sense. I imagine a y = x line, with a line above and a line below that grow apart based on the variable z.
I expect this plot to look like the end of a trumpet, with y = x down the middle, but I'm not entirely sure. What I do know is that as the mean grows, so should the range around the mean. Like so: What I think the plot should look like
I am more than happy to clarify what I'm looking for. And I would sincerely appreciate any help getting my head sorted out on this.
Cheers