I recently planned and built a synthesizer based on the Si564AAA, controlled by an Arduino over the I2C bus. Everything works as it should, except that the output levels of CLK+ and CLK- are not as expected. I can only measure 5 mVpp, but it should be 800 mVpp, or 566 mVeff (LVPECL low should be 1.6 V and high 2.4 V, for a difference of 0.8 V). The clock output pins are connected through 130 Ω resistors to ground and through a 1 nF capacitor to a 50 Ω termination, as recommended in the datasheet.
Could you please tell me where I am going wrong?
With best regards, Werner Pichl, OE7WPA
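For reference, the expected levels can be sanity-checked from the usual 3.3 V LVPECL rule-of-thumb approximations (VOH ≈ VCC − 0.9 V, VOL ≈ VCC − 1.7 V); this is only a sketch of the nominal arithmetic, assuming a 3.3 V supply:

```python
# Nominal 3.3 V LVPECL output levels, using the usual rule-of-thumb
# approximations VOH = VCC - 0.9 V and VOL = VCC - 1.7 V.
VCC = 3.3
VOH = VCC - 0.9          # ~2.4 V, the "high" level quoted in the post
VOL = VCC - 1.7          # ~1.6 V, the "low" level quoted in the post

single_ended_swing = VOH - VOL                 # per-pin swing, ~0.8 Vpp
differential_swing = 2 * single_ended_swing    # (CLK+) - (CLK-), ~1.6 Vpp

print(VOH, VOL, single_ended_swing, differential_swing)
```

A 5 mVpp reading per pin is so far below these nominal levels that it suggests the output driver configuration (format or enable state) rather than the termination, though that is only a guess.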
I want to use the SI5386A device to drive DCLK & SYSREF for a JESD204B ADC device.
I don't plan to use LTE or other cellular frequencies.
My input clock is a clean 125 MHz from an OCXO, and the output DCLK for the ADC is 500 MHz.
The jitter performance design goal is less than 150 fs.
Can I use a TCXO reference clock frequency other than 54 MHz? Will it be OK to use a high-performance 50 MHz TCXO instead?
I would like to know how to interpret the "0 ppm error" feature stated in several datasheets. For example, the Si5351 datasheet states "Exact frequency synthesis at each output (0 ppm error)", while the Si5341/40 datasheet mentions "... sub-100 fs rms phase jitter performance with 0 ppm error". To the best of my knowledge, the documents do not elaborate elsewhere on what this means exactly. Does the 0 ppm error statement only cover the precision of the PLL/MultiSynth multiplication factors, or does it actually say something about the stability of the synthesized output clocks?
Let's assume a Si53xx clock generator is fed by a high-stability external clock with a given Allan deviation (say ADEV = 1e-11) and this clock is then multiplied by a given factor (say 10). Does the 0 ppm error statement imply that the generated output clock has the same stability as the input clock? If so, over what measurement time, i.e. are there short-term instabilities? What is the dependency on, for example, temperature or power supply noise?
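One common reading of the "0 ppm error" claim is that the PLL/MultiSynth ratio is an exact rational number, so the output is an exact rational multiple of the input: any drift of the reference passes through scaled, but the synthesis itself adds no frequency offset. A sketch with Python's exact rational arithmetic (the divider values a, b, c, d here are made up purely for illustration):

```python
from fractions import Fraction

# Hypothetical feedback multiplier a + b/c and output divider d
a, b, c = 35, 1, 4           # made-up MultiSynth-style ratio: 35 + 1/4
d = 14                       # made-up output divider

f_in = Fraction(25_000_000)  # 25 MHz reference, treated as exact
ratio = (a + Fraction(b, c)) / d
f_out = f_in * ratio

# The ratio is an exact rational, so the synthesized frequency has
# 0 ppm error *relative to the input*; reference drift still scales.
print(ratio)   # 141/56
print(f_out)   # 440625000/7, i.e. ~62.946 MHz exactly as a rational
```

On this reading, the stability question (Allan deviation, temperature, supply noise) is a property of the reference plus the PLL's own noise floor, not of the "0 ppm" arithmetic.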
I've read the datasheet countless times and tried many things, but I am stuck.
The problem occurs only when setting register 0x07 over I2C. The datasheet says to write 0x08 to this register to trigger FCAL and output the new frequency (bit 3 of the register). I send this command and check it on the scope; the waveform is in fact <slave_addr+ack><reg_addr+ack><0x08+ack>.
But when I read the register back, it is 0 and there is no output. All other registers read back as what they were set to. I also get an error when I try to write the word 0x80 (Power Off Reset, the general reset command) to register 0x07, where I get a null pointer error for the address. Any help?
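For what it's worth, here is how I'd sanity-check the bytes that should appear on the bus for that write. Register 0x07 and the FCAL value 0x08 are taken from the post above; the 7-bit slave address 0x55 is only a placeholder:

```python
def i2c_write_frame(addr7, reg, value):
    """Byte sequence an I2C master clocks out for a one-register write:
    address byte (7-bit address << 1 with R/W = 0), register, data."""
    return bytes([(addr7 << 1) | 0x0, reg, value])

# Placeholder 7-bit slave address 0x55; register 0x07, FCAL bit (bit 3) set
frame = i2c_write_frame(0x55, 0x07, 0x08)
print(frame.hex())  # 'aa0708'
```

If the bus traffic is correct, it may be worth checking whether the register description marks the FCAL bit as self-clearing; a bit that auto-clears once calibration completes would read back as 0 even after a successful write.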
The 100 MHz reference clock that I am using passes all the jitter tests in the PCIe Clock Tool. The only test it fails is the rising/falling edge rate. Could this failure be due to the test setup? I am attaching the test report for your reference.
Please help me understand this failure.
Hi, I would like to use the Si5338 as part of a low-update-rate control loop. The CLK0 output frequency will be adjusted on the fly over I2C.
If I send an I2C block write to the MS1 registers (addresses 53:62 decimal) while CLK0 is active, will it transition smoothly to the new frequency without glitches?
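As background on what those ten registers hold: Silabs MultiSynth dividers are programmed as three packed parameters P1/P2/P3 derived from the divide ratio a + b/c. The formulas below are the ones documented for the Si5351 in AN619; I am assuming, not asserting, that the Si5338 MultiSynth uses the same encoding:

```python
def multisynth_params(a, b, c):
    """P1/P2/P3 encoding of a MultiSynth ratio a + b/c (per AN619,
    which documents the Si5351; applying it to the Si5338 is an
    assumption here)."""
    floor_term = (128 * b) // c
    p1 = 128 * a + floor_term - 512
    p2 = 128 * b - c * floor_term
    p3 = c
    return p1, p2, p3

# Example: divide ratio 36 + 2/3
print(multisynth_params(36, 2, 3))  # (4181, 1, 3)
```

Whether writing these ten bytes mid-stream produces a glitch-free transition is exactly the question above, and the register encoding alone does not answer it.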
I am using ClockBuilder Pro to make a frequency plan for the Si5340.
I went through every configuration step without warnings or errors, but I noticed on the plan summary that Fvco is 13.2 GHz.
Table 5.8 of the Si5340 datasheet specifies Fvco between 13.5 GHz and 14.4 GHz.
Is it OK to run the VCO outside its specified frequency limits?
Also, ClockBuilder Pro says that if the DCO range is more than 350 ppm, the output jitter may be higher. How bad does it get? Does anybody have a rough figure for it?
N0: DCO Enabled
Fvco: 13.2 GHz
Step Word: 865073
Desired Step Size: 2 ppm
Actual Step Size: 1.999998913724... ppm
Range: 349 ppm
Initial Freq: 56 kHz
Step Size: 0.111999939168... Hz
Min Freq: 55.98051882899... kHz
Max Freq: 56.019494734589... kHz
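The numbers in the plan summary above are self-consistent: the step size in Hz is just the initial frequency scaled by the actual step size in ppm, and 349 ppm of 56 kHz gives the approximate half-range. A quick check:

```python
initial_freq_hz = 56_000.0
actual_step_ppm = 1.999998913724   # from the plan summary

step_hz = initial_freq_hz * actual_step_ppm * 1e-6
print(step_hz)        # ~0.111999939 Hz, matching the "Step Size" line

half_range_hz = initial_freq_hz * 349e-6   # 349 ppm on each side
print(half_range_hz)  # ~19.5 Hz, close to the Min/Max Freq offsets listed
```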
Hi, I'm trying to figure out why I'm seeing such a big discrepancy between the ClockBuilder power estimates and the values measured on the EVB, particularly the +15% delta on VDD.
Si5341-D-EVB powered from USB
Board at room temperature
All jumpers in default position
No loads connected to clock outputs
ClockBuilder Pro v2.29
Estimate conditions: VDD: 1.8 V, Ta: 70 °C, Airflow: None

| Supply | Frequency | Format | Est. V | Est. I | Est. P | Meas. V | Meas. I | Meas. P | Δ Power |
|--------|-----------|--------|--------|--------|--------|---------|---------|---------|---------|
| VDD    | N/A     | N/A    | 1.8 V | 80.8 mA  | 145.5 mW | 1.799 V | 93.0 mA  | 167.3 mW | 15.0 %  |
| VDDA   | N/A     | N/A    | 3.3 V | 117.4 mA | 387.3 mW | 3.321 V | 120.0 mA | 398.5 mW | 2.9 %   |
| VDD0   | 100 MHz | LVDS   | 3.3 V | 15.6 mA  | 51.4 mW  | 3.309 V | 16.0 mA  | 52.9 mW  | 3.1 %   |
| VDD1   | 50 MHz  | LVCMOS | 3.3 V | 16.3 mA  | 53.8 mW  | 3.313 V | 16.0 mA  | 53.0 mW  | -1.5 %  |
| VDD2   | 1 MHz   | LVCMOS | 3.3 V | 13.6 mA  | 44.7 mW  | 3.292 V | 7.0 mA   | 23.0 mW  | -48.5 % |
| VDD3   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| VDD4   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| VDD5   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| VDD6   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| VDD7   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| VDD8   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| VDD9   | Unused  | N/A    | 0.0 V | 0.0 mA   | 0.0 mW   | 0.000 V | 0.0 mA   | 0.0 mW   | ---     |
| Total  |         |        |       | 244 mA   | 683 mW   |         | 252 mA   | 695 mW   | 1.8 %   |
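For clarity, the delta column works out to (measured − estimated) / estimated power; reproducing the VDD row:

```python
est_mw, meas_mw = 145.5, 167.3     # VDD row: estimated vs measured power
delta_pct = (meas_mw - est_mw) / est_mw * 100
print(round(delta_pct, 1))  # 15.0
```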
Furthermore, the DC characteristics on page 19 of the datasheet show IDD max as twice IDD typical. I'd like to be able to see something similar in the ClockBuilder estimate. If I vary the ambient temperature between -55 °C and +85 °C in CB, I only see a change in IDD of about 20 mA, not the 115 mA seen in the datasheet. Also, is there a way to alter the VDD and VDDA input voltages in ClockBuilder?
I have a requirement to program a Si5380A-D-GM base part driven by a 48 MHz crystal. The IN0-IN3 pins are unconnected.
When I use ClockBuilder Pro (v2.30.1) to build a register set, it only offers a reference XTAL selection of 54 MHz. How do I change the reference to 48 MHz to match my configuration?
For an outdoor IP radio project, we need to connect the synchronized clock from the DPLL to an Ethernet switch, and a synchronous clock from the Ethernet switch back to the DPLL. The operating voltages of the two devices (Ethernet switch and DPLL) are different, so we need a clock buffer/level translator, and we have selected the Si53307 from Silabs. The synchronized clock (SECLK) is 25 MHz and the synchronous clock (SCLK) is 125 MHz.
Our questions are:
1. Does the Si53307 support both 3.3 V to 1.8 V (DPLL to Ethernet switch) and 1.8 V to 3.3 V (switch to DPLL) translation?
2. What is the maximum propagation delay in each of the two cases (DPLL to Ethernet switch and Ethernet switch to DPLL)?
3. Is the propagation delay fixed or varying?
4. If it varies, please mention a device with a fixed propagation delay, if one is available.
5. If this device is not suited, please suggest another clock buffer/level translator from Silabs for this application.
I am planning to use the Si5319 to clean up a jittery 56 kHz clock, without any frequency translation (fin = fout).
Using DSPLLsim, I can see that my settings give a possible CKIN1 input frequency range of 0.049897 MHz (min) to 0.058333 MHz (max).
I can select the bandwidth to meet the Si5319 jitter tolerance.
However, my clock drifts slowly (1 Hz/s) around the 56 kHz center frequency (while still staying within the allowed input frequency range above).
Chapter 6.2.3 of the Si53xx Family Reference Manual says that if CKIN changes by more than 500 ppm, the device may initiate self-calibration, and that during self-calibration the output clock can vary by ±20%.
Is the 500 ppm threshold a consistent design parameter, or is it a variable threshold that might take effect only on certain devices?
Are there any workarounds?
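To put the drift rate in perspective, here is the simple arithmetic, assuming the ±500 ppm window is referenced to the frequency seen at the last calibration:

```python
center_hz = 56_000.0        # nominal clock
drift_rate_hz_per_s = 1.0   # stated drift rate

window_hz = center_hz * 500e-6            # 500 ppm threshold = 28 Hz
time_to_threshold_s = window_hz / drift_rate_hz_per_s
print(window_hz, time_to_threshold_s)     # 28.0 Hz, 28.0 s
```

So a sustained 1 Hz/s drift in one direction could cross the threshold in under half a minute, whereas a drift that wanders back and forth around 56 kHz might never accumulate 28 Hz from the calibration point.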
Thanks for your help
Is there any replacement for the obsolete Silabs part number TS3005ITD1033TP?
Thank you and Best Regards,