On Friday 16 October 2015 15:53:04 Deepa Dinamani wrote:
> Forgot to do reply-all earlier.
> Resending.
>
> I considered using ktime_get_seconds() earlier. However, I'm not
> convinced that the driver actually needs time in seconds, and it would
> be hard to guess without actually running this on a platform. So I used
> ktime_get() instead, so that when the driver gets cleaned up later it
> can be updated to use the correct time granularity.
>
> That the driver can sometimes end up calculating time from the epoch
> seems to hint that it needs more cleanup.
>
> Do you think this is reasonable?
The driver only uses the time in one place, which is the ioctl function returning the connection time in seconds, and this is a debugging interface.
From this, we know that there is little value in using a more accurate time representation. Your approach gives us better rounding, but it won't really matter, as the connection time for a wireless connection is not interesting in the low seconds anyway.
Regarding future cleanups, you should not introduce features just because you think they might be used later: code is easy to change, so if someone needs the higher resolution, they can change it then. In this case, it's particularly unlikely to change, as the only user is an ioctl. Changing that interface would break existing binaries, so we don't want that.
A more likely cleanup would be the removal of the debug interface, replacing it with something completely different.
Arnd