For the most accurate results, I would imagine you will need an automatic solder wire feeding station, at least to maintain consistency.
Is this something you have on-site to perform the test with?
- Indeed, a pretty straightforward process. We have done many in our lab testing different brands of alloys; however, all tests are ideally performed with some sort of automated feed for test-to-test consistency.
Lastly (and perhaps I should have mentioned this earlier): flux is flammable. As such, if the temperature is too high, you will definitely see an increase in solder ball counts. This is because of the out-gassing occurring within the flux-cored wire. The gas created from the flux (igniting, in this instance) is what creates the flux spatter and solder ball defects.
To reduce this: try to solder at a temperature as close as possible to the liquidus point of your alloy (the lowest workable temperature, assuming the alloy is eutectic). If the alloy is not eutectic, you will need to target the highest liquidus temperature in the composition. (E.g. SAC305: the three constituents (tin, silver, and copper) have three different liquidus temperatures, meaning it's non-eutectic. You would want to stay as close to 350°C as possible, or, if you are running a slightly modified version of SAC305, target the highest melting point of your chemistry.) This all assumes your temperature controller is properly calibrated.
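To make that decision rule concrete, here is a minimal Python sketch. The function names, tolerance, margin, and temperature values are all illustrative assumptions for the sake of the example, not real alloy data or a calibrated process recipe:

```python
def is_eutectic(liquidus_points_c, tol_c=1.0):
    """Treat the alloy as eutectic if all constituent liquidus
    temperatures match within a small tolerance (simplified model)."""
    return max(liquidus_points_c) - min(liquidus_points_c) <= tol_c

def target_temp_c(liquidus_points_c, margin_c=25.0):
    """Suggested solder temperature in degrees C.

    Eutectic: work right at the shared melting point (lowest possible).
    Non-eutectic: clear the highest liquidus in the composition,
    plus a working margin (margin_c is an assumed placeholder).
    """
    highest = max(liquidus_points_c)
    if is_eutectic(liquidus_points_c):
        return highest
    return highest + margin_c

# Illustrative only: three made-up constituent liquidus points.
print(target_temp_c([232.0, 221.0, 227.0]))  # non-eutectic case
```

In practice you would plug in the datasheet liquidus values for your specific chemistry and verify against your calibrated temperature controller rather than trusting a computed number.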
Other options to reduce these defects during production follow my original suggestion: try implementing a perforation gearing system in the process. There are two main types: V-score (like a pizza cutter) and drilling (which actually drills holes into the wire using a serrated gear head). Depending on the alloy and diameter, one technology tends to perform better than the other.
Wishing you the best of luck!
PS The PDF Version worked!