I have designed an SV/UVM framework, initially for system simulation (architecture evaluations).
I have been using clocking blocks to generate synchronous inputs (and for monitoring), and as far as I can tell such a framework is essentially "cycle-based": something happens every clock cycle, or at least the components check on each cycle whether there is something to do.
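To make it concrete, here is a minimal sketch of the style I mean (the interface and signal names are just illustrative, not my actual code):

```systemverilog
// Hypothetical DUT-facing interface: everything is tied to the clock edge
interface bus_if (input logic clk);
  logic        valid;
  logic [31:0] data;

  // Driving and sampling happen relative to @(posedge clk)
  clocking drv_cb @(posedge clk);
    default input #1step output #0;
    output valid, data;
  endclocking
endinterface

// In the driver, each transaction consumes a whole number of cycles
task automatic drive_item (virtual bus_if vif, input logic [31:0] payload);
  @(vif.drv_cb);               // wait for the active edge
  vif.drv_cb.valid <= 1'b1;
  vif.drv_cb.data  <= payload;
  @(vif.drv_cb);
  vif.drv_cb.valid <= 1'b0;
endtask
```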
I am now being asked to consider the effect of RANDOM DELAYS (with respect to the clock edge) on the signals arriving at the front end of the chip, and also to take clock jitter into account.
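What I imagine (and please correct me if this is the wrong approach) is something like the sketch below: a clock generator whose edges are perturbed by a bounded random amount, and inputs driven a random time after the edge rather than through a clocking block. All the numbers and names here are made up for illustration:

```systemverilog
// Jittery clock: nominal 10 ns period, each half-period shifted by up to ±200 ps
logic clk = 1'b0;
initial forever begin
  int jitter_ps = $urandom_range(0, 400) - 200;  // hypothetical jitter bound
  #((5000 + jitter_ps) * 1ps) clk = ~clk;        // 5000 ps = nominal half-period
end

// Drive an input with a random skew after the clock edge (no clocking block)
task automatic drive_async (ref logic sig, input logic value);
  @(posedge clk);
  #($urandom_range(0, 500) * 1ps);               // random delay w.r.t. the edge
  sig = value;
endtask
```

I am not sure whether this kind of delay injection belongs in the driver, in the interface, or in a separate "analog front-end" shim between the testbench and the DUT.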
I am not an expert on these topics and I would appreciate some advice on the following:
- First of all, is it reasonable to model these effects in a system-level verification framework, or should they be taken into account at a different level?
- What is the best/recommended way of doing it? (I guess the first step is getting rid of the clocking blocks.)
- What impact would this have on simulation performance?
Thanks in advance to anybody who can help.